162 results for Mixed complementarity problem


Relevance:

20.00%

Publisher:

Abstract:

This note develops a flexible methodology for splicing economic time series that avoids the extreme assumptions implicit in the procedures most commonly used in the literature. It allows the user to split the required correction to the older of the two series being linked between its levels and its growth rates, on the basis of what he knows or conjectures about the persistence of the factors that account for the discrepancy between the two series at their linking point. The time profile of the correction is derived from the assumption that the error in the older series reflects inadequate coverage of emerging sectors or activities that grow faster than the aggregate.
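The splicing idea can be sketched in a few lines. In this hypothetical implementation (ours, not the note's), the log discrepancy at the linking point is pushed back into the older series with a persistence parameter `rho`: `rho = 1` corrects levels only (classic retropolation), `rho = 0` corrects only the final observation, and intermediate values split the correction between levels and growth rates, mimicking under-covered fast-growing sectors whose weight shrinks further back in time.

```python
import numpy as np

def splice(old, new, rho):
    """Splice an older series onto a newer one at the point where both exist.

    `old` ends at the period where `new` begins (one overlapping point).
    The log discrepancy at the link is distributed back into the older
    series with persistence `rho`: the correction applied k periods before
    the link decays as rho**k.
    """
    gap = np.log(new[0]) - np.log(old[-1])      # discrepancy at the linking point
    n = len(old)
    weights = rho ** np.arange(n - 1, -1, -1)   # 1.0 at the link, rho**k earlier
    corrected_old = np.exp(np.log(old) + gap * weights)
    return np.concatenate([corrected_old, new[1:]])
```

With `rho` near 1 the discrepancy is treated as almost permanent and the whole older series is shifted; with `rho` near 0 it is treated as a recent phenomenon concentrated near the link.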


The division problem consists of allocating a given amount of a homogeneous and perfectly divisible good among a group of agents with single-peaked preferences over the set of their potential shares. A rule proposes a vector of shares for each division problem. The literature has implicitly assumed that agents will find any share they are assigned acceptable. In this paper we consider the division problem when agents' participation is voluntary. Each agent has an idiosyncratic interval of acceptable shares, over which his preferences are single-peaked. A rule has to propose to each agent either non-participation or an acceptable share, because otherwise he would opt out and the remaining agents' shares would have to be reassigned. We study a subclass of efficient and consistent rules and characterize extensions of the uniform rule that deal explicitly with agents' voluntary participation.
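The uniform rule that these characterizations extend is a standard object. A minimal sketch of its classic version (without the paper's participation constraints) solves for the common bound by binary search:

```python
def uniform_rule(T, peaks, tol=1e-9):
    """Classic uniform rule for the division problem.

    Under excess demand (sum of peaks >= T) agent i receives
    min(peak_i, lam); under excess supply, max(peak_i, lam); the bound
    lam is chosen by binary search so that the shares sum to T.
    """
    demand = sum(peaks) >= T
    lo, hi = 0.0, max(max(peaks), T)
    while hi - lo > tol:
        lam = (lo + hi) / 2
        total = sum(min(p, lam) if demand else max(p, lam) for p in peaks)
        if total < T:          # allocated total is increasing in lam
            lo = lam
        else:
            hi = lam
    lam = (lo + hi) / 2
    return [min(p, lam) if demand else max(p, lam) for p in peaks]
```

For example, dividing 1 unit among peaks (0.2, 0.5, 0.9) rations the two larger demands down to a common level, while agents below the bound get exactly their peak.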


The accelerating invention of new hardware and software modifies, almost daily, our perception of the world and, therefore, cultural production, blurring the boundaries between such pairs as art-literature, picture-book, image-text. Although these pairs have always been objects of theoretical discourse, the discussion takes on growing urgency now that new technologies expose what had been sheltered in the realm of theory. The very way we understand reality is affected by the immediacy of these media. This research analyses the work of several new-media authors who address the problem of the representation of memory from this contemporary perspective. The doctoral thesis centres on the form of representation of memory as articulated in the work of Chris Marker. Of particular interest are the latest devices created by the author within the framework of the so-called new technologies and the new spaces for exhibiting cinema. The project proposes an analysis of the memory these discourses suggest through their characteristic themes: the archive, cultural identities, the spectator's contribution, databases, and the technological treatment of information. The work of Chris Marker was selected because the characteristics of its production and discourse allow a broad discussion of the so-called new technologies and the world they represent in the new hybrid space built between the visual arts, literature, and technology.


We investigate the role of earnings quality in determining the level of segment disclosure, and whether and how better-quality earnings and segment disclosure influence the cost of capital. Using a large US sample for the period 2001-2006, we find a positive relation between earnings quality and the level of segment disclosure. We also find that firms providing better-quality segment information, contingent upon good earnings quality, enjoy a lower cost of capital. We base our empirical tests on a self-created index of segment disclosure. Our results contribute to a better understanding of (1) the incentives for providing segment disclosures, and (2) how accounting quality (quality of segment information and earnings quality) is related to the cost of capital.


We study a general static noisy rational expectations model where investors have private information about asset payoffs, with common and private components, and about their own exposure to an aggregate risk factor, and we derive conditions for existence and uniqueness (or multiplicity) of equilibria. We find that a main driver of the characterization of equilibria is whether the actions of investors are strategic substitutes or complements. This property is in turn driven by the strength of a private learning channel from prices, arising from the multidimensional sources of asymmetric information, relative to the usual public learning channel. When the private learning channel is strong (weak) relative to the public one, we have strong (weak) strategic complementarity in actions and potentially multiple (unique) equilibria. The results enable a precise characterization of whether information acquisition decisions are strategic substitutes or complements. We find that the strategic substitutability in information acquisition obtained in Grossman and Stiglitz (1980) is robust. JEL classification: D82, D83, G14. Keywords: rational expectations equilibrium, asymmetric information, risk exposure, hedging, supply information, information acquisition.


This paper studies endogenous mergers of complements with mixed bundling, allowing both for joint and for separate consumption. After merger, partner firms decrease the price of the bundled system. Moreover, when markets for individual components are sufficiently important, partner firms raise the prices of stand-alone products, exploiting their monopoly power in local markets and making substitute 'mix-and-match' composite products less attractive to consumers. Even though these effects favor the profitability of mergers, merging is not always an equilibrium outcome. The reason is that outsiders respond by cutting their prices to retain their market share, and mergers can be unprofitable when competition is intense. A welfare analysis shows that the number of mergers observed in equilibrium may be either excessive (when markets for individual components are important) or suboptimal (when they are less important). Keywords: complements; merger; mixed bundling; separate consumption. JEL classification: L13; L41; D43.


This paper studies a dynamic principal-monitor-agent relation in which a strategic principal delegates the task of monitoring the effort of a strategic agent to a third party, the monitor, whose type is initially unknown. Through repeated interaction the agent might learn the monitor's type. We show that this process damages the principal's payoffs. Compensation is assumed exogenous, limiting to a great extent the provision of incentives. We get around this difficulty by introducing costly replacement strategies: the principal replaces the monitor, thus disrupting the agent's learning. We find that even when replacement costs are null, if the revealed monitor is strictly preferred by both parties, there is a loss in efficiency due to the impossibility of benefitting from it. Nonetheless, these strategies can partially recover the principal's losses. Additionally, we establish upper and lower bounds on the payoffs that the principal and the agent can achieve. Finally, we characterize the equilibrium strategies under public and private monitoring (with communication) for different cost and impatience levels.


In this study I try to explain the systemic problem of the low economic competitiveness of nuclear energy for the production of electricity by carrying out a biophysical analysis of its production process. Given that neither econometric approaches nor one-dimensional methods of energy analysis are effective, I introduce the concept of biophysical explanation as a quantitative analysis capable of handling the inherent ambiguity associated with the concept of energy. In particular, the quantities of energy considered relevant for the assessment can only be measured and aggregated after having agreed on a pre-analytical definition of a grammar characterizing a given set of finite transformations. Using this grammar it becomes possible to provide a biophysical explanation for the low economic competitiveness of nuclear energy in the production of electricity. When comparing the various unit operations of the process of producing electricity with nuclear energy to the analogous unit operations of the process of producing fossil energy, we see that the various phases of the process are the same. The only difference relates to the characteristics of the process associated with the generation of heat, which are completely different in the two systems. Since the cost of production of fossil energy provides the baseline of economic competitiveness of electricity, the (lack of) economic competitiveness of the production of electricity from nuclear energy can be studied by comparing the biophysical costs associated with the different unit operations taking place in nuclear and fossil power plants when generating process heat or net electricity. In particular, the analysis focuses on fossil-fuel requirements and labor requirements for those phases that nuclear plants and fossil energy plants have in common: (i) mining; (ii) refining/enriching; (iii) generating heat/electricity; (iv) handling the pollution/radioactive wastes.
By adopting this approach, it becomes possible to explain the systemic low economic competitiveness of nuclear energy in the production of electricity, because of: (i) its dependence on oil, limiting its possible role as a carbon-free alternative; (ii) the choices made in relation to its fuel cycle, especially whether it includes reprocessing operations or not; (iii) the unavoidable uncertainty in the definition of the characteristics of its process; (iv) its large inertia (lack of flexibility) due to issues of time scale; and (v) its low power level.


The aim of this paper is to analyse the effects of human capital, advanced manufacturing technologies (AMT), and new work organizational practices on firm productivity, while taking into account the synergies existing between them. This study expands current knowledge in this area in two ways. First, in contrast with previous works, we focus on AMT and not ICT (information and communication technologies). Second, we use a unique employer-employee data set for small firms in a particular area of southern Europe (Catalonia, Spain). Using a small-firm data set allows us to analyse the particular case of small and medium enterprises, since we cannot assume they have the same characteristics as large firms. The results provide evidence in favor of the complementarity hypothesis between human capital, advanced manufacturing technologies, and new work organization practices, although we show that the complementarity effects depend on what type of work organization practices a firm uses. For small and medium Catalan firms, the only set of work organization practices that improves the benefits of human capital and technology investment consists of the more quality-oriented practices, such as quality circles, problem-solving groups or total quality management.
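Complementarity hypotheses of this kind are typically tested with interaction terms in a productivity regression: a positive coefficient on the interaction of the inputs is the usual evidence that they reinforce each other. A toy sketch on simulated data (the variable names and data-generating process are ours, not the paper's):

```python
import numpy as np

def ols(X, y):
    """OLS coefficients via least squares."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Hypothetical firm-level data: (log) productivity y regressed on human
# capital (hc), an AMT-adoption dummy (amt) and a quality-oriented work
# practices dummy (wop), plus all their interactions.  The data are
# simulated with a positive triple interaction, i.e. built-in complementarity.
rng = np.random.default_rng(0)
n = 500
hc = rng.random(n)                      # human capital index in [0, 1]
amt = rng.integers(0, 2, n)             # AMT adoption dummy
wop = rng.integers(0, 2, n)             # quality-practices dummy
y = (1.0 + 0.5 * hc + 0.3 * amt + 0.2 * wop
     + 0.4 * hc * amt * wop + rng.normal(0.0, 0.1, n))
X = np.column_stack([np.ones(n), hc, amt, wop,
                     hc * amt, hc * wop, amt * wop, hc * amt * wop])
beta = ols(X, y)
# beta[7] is the triple-interaction coefficient: positive when the three
# inputs are complements in this simulated sample
```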


The generator problem was posed by Kadison in 1967, and it remains open today. We provide a solution for the class of C*-algebras absorbing the Jiang-Su algebra Z tensorially. More precisely, we show that every unital, separable, Z-stable C*-algebra A is singly generated, which means that there exists an element x ∈ A that is not contained in any proper sub-C*-algebra of A. To give applications of our result, we observe that Z can be embedded into the reduced group C*-algebra of a discrete group that contains a non-cyclic, free subgroup. It follows that certain tensor products with reduced group C*-algebras are singly generated. In particular, C*_r(F_∞) ⊗ C*_r(F_∞) is singly generated.


The literature on local services has focused on the effects of privatization and, if anything, has compared the effects of private and mixed public-private systems versus public provision. However, alternative forms of provision such as cooperatives, which can be very prevalent in many developing countries, have been completely ignored. In this paper, we investigate the effects of communal water provision (Comités Vecinales and Juntas Administrativas de Servicios de Saneamiento) on child health in Peru. Using detailed survey data at the household and child level for the years 2006-2010, we exploit the cross-section variability to assess the differential impact of this form of provision. Despite controlling for a wide range of household and local characteristics, the municipalities served by communal organizations are more likely to have poorer health indicators, which would result in a downward bias on the absolute magnitude of the effect of cooperatives. We rely on an instrumental variable strategy to deal with this potential endogeneity problem, and use the personnel resources and the administrative urban/rural classification of the municipalities as instruments for the provision type. The results show a negative and significant effect of communal water provision on diarrhea among under-five-year-old children. Keywords: water utilities, cooperatives, child health, regulation, Peru. JEL Classification Numbers: L33; L50; L95.
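The instrumental-variable strategy can be illustrated with a generic two-stage least squares estimator (a textbook sketch, not the authors' specification; their instruments are the personnel resources and the urban/rural classification of the municipalities, while the names below are hypothetical):

```python
import numpy as np

def two_stage_least_squares(y, X, Z):
    """Textbook 2SLS.

    First stage: regress the (possibly endogenous) regressors X on the
    instrument matrix Z and keep the fitted values.  Second stage: regress
    y on those fitted values.  Z must contain the exogenous controls plus
    at least as many excluded instruments as endogenous columns of X.
    """
    P, *_ = np.linalg.lstsq(Z, X, rcond=None)   # first-stage coefficients
    X_hat = Z @ P                               # fitted (instrumented) regressors
    beta, *_ = np.linalg.lstsq(X_hat, y, rcond=None)
    return beta
```

When the regressor is correlated with the error term (here, communal provision being more common where health is already worse), OLS is biased, while 2SLS with a valid instrument recovers the causal coefficient.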


This paper shows how instructors can use the problem-based learning method to introduce producer theory and market structure in intermediate microeconomics courses. The paper proposes a framework where different decision problems are presented to students, who are asked to imagine that they are the managers of a firm who need to solve a problem in a particular business setting. In this setting, the instructor's role is to provide both guidance to facilitate student learning and content knowledge on a just-in-time basis.


The statistical analysis of literary style is the part of stylometry that compares measurable characteristics in a text that are rarely controlled by the author with those in other texts. When the goal is to settle authorship questions, these characteristics should relate to the author's style and not to the genre, epoch or editor, and they should be such that their variation between authors is larger than the variation within comparable texts from the same author. For an overview of the literature on stylometry and some of the techniques involved, see for example Mosteller and Wallace (1964, 82), Herdan (1964), Morton (1978), Holmes (1985), Oakes (1998) or Lebart, Salem and Berry (1998).

Tirant lo Blanc, a chivalry book, is the main work in Catalan literature, hailed as "the best book of its kind in the world" by Cervantes in Don Quixote. Considered by writers like Vargas Llosa or Dámaso Alonso to be the first modern novel in Europe, it has been translated several times into Spanish, Italian and French, with modern English translations by Rosenthal (1996) and La Fontaine (1993). The main body of the book was written between 1460 and 1465, but it was not printed until 1490. There is an intense and long-lasting debate about its authorship, sprouting from its first edition, whose introduction states that the whole book is the work of Martorell (1413?-1468), while at the end it is stated that the last one fourth of the book is by Galba (?-1490), written after the death of Martorell. Some of the authors that support the theory of single authorship are Riquer (1990), Chiner (1993) and Badia (1993), while some of those supporting double authorship are Riquer (1947), Coromines (1956) and Ferrando (1995). For an overview of this debate, see Riquer (1990). Neither of the two candidate authors left any text comparable to the one under study, and therefore discriminant analysis cannot be used to help classify chapters by author.

By using sample texts encompassing about ten percent of the book, and looking at word length and at the use of 44 conjunctions, prepositions and articles, Ginebra and Cabos (1998) detect heterogeneities that might indicate the existence of two authors. By analyzing the diversity of the vocabulary, Riba and Ginebra (2000) estimate that stylistic boundary to be near chapter 383. Following the lead of this extensive literature, this paper looks into word length, the use of the most frequent words, and the use of vowels in each chapter of the book. Given that the features selected are categorical, this leads to three contingency tables of ordered rows and therefore to three sequences of multinomial observations.

Section 2 explores these sequences graphically, observing a clear shift in their distribution. Section 3 describes the problem of estimating a sudden change-point in those sequences. In the following sections we propose various ways to estimate change-points in multinomial sequences: the method in Section 4 involves fitting models for polytomous data; the one in Section 5 fits gamma models onto the sequence of chi-square distances between each row profile and the average profile; the one in Section 6 fits models onto the sequence of values taken by the first component of the correspondence analysis, as well as onto sequences of other summary measures like the average word length. In Section 7 we fit models onto the marginal binomial sequences to identify the features that distinguish the chapters before and after that boundary. Most methods rely heavily on the use of generalized linear models.
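A generic way to estimate a single sudden change-point in a sequence of multinomial observations, related in spirit to the methods proposed here but not identical to any of them, is to pick the split that maximizes the two-segment multinomial log-likelihood:

```python
import numpy as np

def multinomial_loglik(counts):
    """Log-likelihood (up to row-wise constants) of a block of multinomial
    rows under one common category distribution, estimated by the pooled
    relative frequencies of the block."""
    totals = counts.sum(axis=0)
    p = totals / totals.sum()
    mask = totals > 0
    return float((totals[mask] * np.log(p[mask])).sum())

def change_point(counts):
    """Return the row index k that best splits the sequence into [0, k)
    and [k, n), maximizing the sum of the two segments' log-likelihoods.
    `counts` is an (n_rows, n_categories) array, e.g. chapters x features."""
    n = counts.shape[0]
    best_k, best_ll = 1, -np.inf
    for k in range(1, n):
        ll = multinomial_loglik(counts[:k]) + multinomial_loglik(counts[k:])
        if ll > best_ll:
            best_k, best_ll = k, ll
    return best_k
```

The row-wise multinomial coefficients are constant across splits, so they can be dropped from the comparison; only the pooled `totals * log(p)` terms matter.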


Epipolar geometry is a key concept in computer vision, and fundamental matrix estimation is the only way to compute it. This article surveys several methods of fundamental matrix estimation, classified into linear methods, iterative methods and robust methods. All of these methods have been programmed and their accuracy analysed using real images. A summary, accompanied by experimental results, is given.
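The simplest of the linear methods, the classic eight-point algorithm, can be sketched as follows (an unnormalized toy version, not the article's code; in practice Hartley's coordinate normalization greatly improves conditioning):

```python
import numpy as np

def fundamental_matrix_8pt(x1, x2):
    """Linear (unnormalized) eight-point estimate of the fundamental matrix.

    x1, x2: (n, 2) arrays of corresponding image points, n >= 8.
    Solves x2^T F x1 = 0 in the least-squares sense via SVD, then
    enforces the rank-2 constraint on F.
    """
    n = x1.shape[0]
    A = np.zeros((n, 9))
    for i in range(n):
        u1, v1 = x1[i]
        u2, v2 = x2[i]
        A[i] = [u2 * u1, u2 * v1, u2, v2 * u1, v2 * v1, v2, u1, v1, 1.0]
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)        # right singular vector of smallest singular value
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0                      # zero the smallest singular value: rank 2
    return U @ np.diag(S) @ Vt
```

The iterative and robust methods surveyed refine this kind of linear estimate, e.g. by minimizing geometric distances or by rejecting outlier correspondences.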


We formulate a necessary and sufficient condition for polynomials to be dense in a space of continuous functions on the real line, with respect to Bernstein's weighted uniform norm. Equivalently, for a positive finite measure μ on the real line we give a criterion for density of polynomials in Lp(μ).