959 results for default externalities
Abstract:
The purpose of this article is to delimit the role of pragmatic specialization in the evolution of negation in French. The change in the marking of sentential negation is believed to proceed in well-characterized stages that together constitute the Jespersen cycle. As one marker becomes the default expression of negation, the other markers do not necessarily fade away; they may be maintained in specialized roles that include pragmatic functions. One such pragmatic function is activation (Dryer 1996), by which a proposition is presented as accessible to the hearer. Activation is shown to motivate the use of preverbal 'non', which competed with 'ne' for several centuries. The claims that the emergence of postverbal 'pas' in early French and the loss of 'ne' in contemporary spoken French are associated with activation are assessed on the basis of novel data. It is concluded that pragmatic functions contribute to language change by providing marked options that may later be conferred default status in a grammatical paradigm.
Abstract:
Discussion of open innovation has typically stressed the benefits to the individual enterprise from boundary-spanning linkages and improved internal knowledge sharing. In this paper we explore the potential for wider benefits from openness in innovation and argue that openness may itself generate positive externalities by enabling improved knowledge diffusion. The potential for these (positive) externalities suggests a divergence between the private and social returns to openness, and the potential for a sub-optimal level of openness where this is determined purely by firms' private returns. Our analysis is based on Irish plant-level panel data from the manufacturing industry over the period 1994-2008. Based on instrumental variables regression models, our results suggest that externalities of openness in innovation are significant and that they are positively associated with firms' innovation performance. We find that these externality effects are unlikely to work through their effect on the spread of open innovation practices. Instead, they appear to positively influence innovation outputs by either increasing knowledge diffusion or strengthening competition. Our evidence on the significance of externalities from openness in innovation provides a rationale for public policy aimed at promoting open innovation practices among firms. © 2013 Elsevier B.V. All rights reserved.
Abstract:
The "recursive" definition of Default Logic is shown to be representable in a monotonic Modal Quantificational Logic whose modal laws are stronger than S5. Specifically, it is proven that a set of sentences of First Order Logic is a fixed-point of the "recursive" fixed-point equation of Default Logic with an initial set of axioms and defaults if and only if the meaning of the fixed-point is logically equivalent to a particular modal functor of the meanings of that initial set of sentences and of the sentences in those defaults. This is important because the modal representation allows the use of powerful automatic deduction systems for Modal Logic and because unlike the original "recursive" definition of Default Logic, it is easily generalized to the case where quantified variables may be shared across the scope of the components of the defaults.
Abstract:
The nonmonotonic logic called Default Logic is shown to be representable in a monotonic Modal Quantificational Logic whose modal laws are stronger than S5. Specifically, it is proven that a set of sentences of First Order Logic is a fixed-point of the fixed-point equation of Default Logic with an initial set of axioms and defaults if and only if the meaning, or rather the disquotation, of that set of sentences is logically equivalent to a particular modal functor of the meanings of that initial set of sentences and of the sentences in those defaults. This result is important because the modal representation allows the use of powerful automatic deduction systems for Modal Logic and because, unlike the original Default Logic, it is easily generalized to the case where quantified variables may be shared across the scope of the components of the defaults, thus allowing such defaults to produce quantified consequences. Furthermore, this generalization properly treats such quantifiers since both the Barcan Formula and its converse hold.
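To make the fixed-point definition concrete, the following is a minimal sketch of Reiter-style Default Logic restricted to a toy propositional setting where facts, prerequisites, justifications, and conclusions are all single literals (so deductive closure is trivial). The function names and the guess-and-check enumeration are illustrative choices, not the paper's own modal construction:

```python
from itertools import combinations

def neg(lit):
    # Negate a literal written as a string, with "~" marking negation.
    return lit[1:] if lit.startswith("~") else "~" + lit

def gamma(facts, defaults, E):
    # Gamma(E): least set containing the facts and closed under every default
    # (p, j, c) whose prerequisite p is derived and whose justification j is
    # consistent with the *candidate* extension E (the fixed-point test set).
    S = set(facts)
    changed = True
    while changed:
        changed = False
        for p, j, c in defaults:
            if p in S and neg(j) not in E and c not in S:
                S.add(c)
                changed = True
    return S

def extensions(facts, defaults):
    # E is an extension iff E is a fixed point of Gamma; candidates are the
    # facts plus subsets of the default conclusions.
    concs = sorted({c for _, _, c in defaults})
    found = set()
    for r in range(len(concs) + 1):
        for sub in combinations(concs, r):
            E = frozenset(set(facts) | set(sub))
            if frozenset(gamma(facts, defaults, E)) == E:
                found.add(E)
    return found
```

On the classic Nixon-diamond theory (a Quaker is normally a pacifist, a Republican normally is not), this yields two mutually inconsistent extensions, illustrating why the fixed-point equation can have multiple solutions.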
Abstract:
Reflective Logic and Default Logic are both generalized so as to allow universally quantified variables to cross modal scopes, whereby the Barcan formula and its converse hold. This is done by representing both the fixed-point equation for Reflective Logic and the fixed-point equation for Default Logic as necessary equivalences in the Modal Quantificational Logic Z, and then inserting universal quantifiers before the defaults. The two resulting systems, called Quantified Reflective Logic and Quantified Default Logic, are then compared by deriving metatheorems of Z that express their relationships. The main result is to show that every solution to the equivalence for Quantified Default Logic is a strongly grounded solution to the equivalence for Quantified Reflective Logic. It is further shown that Quantified Reflective Logic and Quantified Default Logic have exactly the same solutions when no default has an entailment condition.
Abstract:
This study extends the Grullon, Michaely, and Swaminathan (2002) analysis by incorporating default risk. Using data for firms that either increased or initiated cash dividend payments during the 23-year period 1986-2008, we find a reduction in default risk. This reduction is shown to be a priced risk factor beyond the Fama and French (1993) risk measures, and it explains the dividend payment decision and the positive market reaction around dividend increases and initiations. Further analysis reveals that the reduction in default risk is a significant factor in explaining the 3-year excess returns following dividend increases and initiations. © Copyright Michael G. Foster School of Business, University of Washington 2011.
Abstract:
With the determination of the principal parameters of production and pollution-abatement technologies, this paper quantifies abatement and external costs at the social optimum and analyses the dynamic relationship between technological development and the above-mentioned costs. Through a partial analysis of the parameters, the paper presents the impacts on the level of pollution and on external costs of extensive and intensive environmental protection, of changes in market demand and product fees, and of technological development that is not oriented toward environmental protection. Parametric cost calculation makes it possible to draw up two useful rules of thumb concerning the rate of government interventions. A paradox of technological development aimed at intensive environmental protection also becomes apparent.
Abstract:
We examine assignment games, where matched pairs of firms and workers create some monetary value to distribute among themselves and the agents aim to maximize their payoffs. In the majority of this literature, externalities (in the sense that a pair's value depends on the pairing of the others) have been neglected. However, in most applications a firm's success depends on, say, the success of its rivals and suppliers. Thus, it is natural to ask: how are the classical results on assignment games affected by the introduction of externalities? The answer is: dramatically. We find that (i) a problem may have no stable outcome, (ii) stable outcomes can be inefficient (not maximize total value), (iii) efficient outcomes can be unstable, and (iv) the set of stable outcomes may not form a lattice. We show that stable outcomes always exist if agents are "pessimistic." This is a knife-edge result: there are problems in which the slightest optimism by a single pair erases all stable outcomes.
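The contrast the abstract draws can be sketched in code: in the classical assignment game each pair's value is a fixed matrix entry, while with externalities the value is a function of the entire matching. Both helpers below are illustrative brute-force sketches (function names and the matrix example are assumptions, not the paper's notation):

```python
from itertools import permutations

def optimal_matching(value):
    """Value-maximizing matching in a classical n-firm/n-worker assignment
    game: value[i][j] is the surplus of pairing firm i with worker j,
    independent of how the other agents are matched (no externalities)."""
    n = len(value)
    best = max(permutations(range(n)),
               key=lambda p: sum(value[i][p[i]] for i in range(n)))
    return [(i, best[i]) for i in range(n)], sum(value[i][best[i]] for i in range(n))

def optimal_matching_ext(n, value_of_matching):
    """With externalities, a pair's value depends on the whole matching, so
    the objective must take the matching itself (a permutation) as input."""
    best = max(permutations(range(n)), key=value_of_matching)
    return [(i, best[i]) for i in range(n)], value_of_matching(best)
```

Note that efficiency (total-value maximization, computed here) and stability come apart once externalities are present: the abstract's results (ii) and (iii) say the efficient matching found this way need not be stable, and vice versa.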
Abstract:
This study assesses Pigouvian taxes introduced in practice as a response to negative externalities. The authors analyze the international practice and effectiveness of taxation on food products harmful to health and on carbon emissions harmful to the environment and, in relation to these two types of taxes, focus on the opportunities and on the factors reducing efficiency.
Abstract:
In this article we analyze asymmetric two-sided markets. Two types of agents are assumed to interact with each other; agents of one type derive utility from inter-group interactions, while agents of the other type benefit from intra-group rather than inter-group interactions, in contrast to the standard symmetric two-sided market model. First, we consider a monopoly platform; then we analyze competing platforms, under both single-homing and multi-homing.
Abstract:
The research reported here is supported by the award made by the RCUK Digital Economy programme to the dot.rural Digital Economy Hub [award reference: EP/G066051/1].
Abstract:
I explore and analyze the problem of finding the socially optimal capital requirements for financial institutions, considering two distinct channels of contagion: direct exposures among the institutions, as represented by a network, and fire-sales externalities, which reflect the negative price impact of massive liquidation of assets. These two channels amplify shocks from individual financial institutions to the financial system as a whole and thus increase the risk of joint defaults among the interconnected financial institutions; this is often referred to as systemic risk. In the model, there is a trade-off between reducing systemic risk and raising the capital requirements of the financial institutions. The policymaker considers this trade-off and determines the optimal capital requirements for individual financial institutions. I provide a method for finding and analyzing the optimal capital requirements that can be applied to arbitrary network structures and arbitrary distributions of investment returns.
In particular, I first consider a network model consisting only of direct exposures and show that the optimal capital requirements can be found by solving a stochastic linear programming problem. I then extend the analysis to financial networks with default costs and show the optimal capital requirements can be found by solving a stochastic mixed integer programming problem. The computational complexity of this problem poses a challenge, and I develop an iterative algorithm that can be efficiently executed. I show that the iterative algorithm leads to solutions that are nearly optimal by comparing it with lower bounds based on a dual approach. I also show that the iterative algorithm converges to the optimal solution.
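As an illustration of the direct-exposure channel, the standard Eisenberg-Noe-style clearing vector (the usual baseline for interbank network models; the thesis's exact formulation, with default costs and capital choices, is richer) can be computed by iterating the payment map from full payment down to its fixed point:

```python
def clearing_payments(liabilities, external_assets, tol=1e-10):
    """Clearing payment vector for an interbank network, Eisenberg-Noe style.
    liabilities[i][j]: amount bank i owes bank j.
    external_assets[i]: bank i's assets outside the network.
    Each bank pays min(its total obligations, its total assets), with
    payments split proportionally among creditors; no default costs."""
    n = len(liabilities)
    pbar = [sum(row) for row in liabilities]          # total obligations
    pi = [[(liabilities[i][j] / pbar[i]) if pbar[i] else 0.0
           for j in range(n)] for i in range(n)]      # payout proportions
    p = pbar[:]                                       # start from full payment
    while True:
        incoming = [sum(pi[i][j] * p[i] for i in range(n)) for j in range(n)]
        new_p = [min(pbar[i], external_assets[i] + incoming[i]) for i in range(n)]
        if max(abs(new_p[i] - p[i]) for i in range(n)) < tol:
            return new_p
        p = new_p
```

For example, with two banks where bank 0 owes 10, bank 1 owes 5, and external assets are 2 and 3, bank 0 can only pay 7 in the fixed point, i.e. it defaults; adding a fixed default cost at that point is what turns the optimization into the mixed-integer problem described above.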
Finally, I incorporate fire sales externalities into the model. In particular, I am able to extend the analysis of systemic risk and the optimal capital requirements with a single illiquid asset to a model with multiple illiquid assets. The model with multiple illiquid assets incorporates liquidation rules used by the banks. I provide an optimization formulation whose solution provides the equilibrium payments for a given liquidation rule.
I further show that the socially optimal capital problem under the "socially optimal liquidation" rule and under prioritized liquidation rules can be formulated as a convex problem and a convex mixed-integer problem, respectively. Finally, I illustrate the results of the methodology on numerical examples and discuss some implications for capital regulation policy and stress testing.