Abstract:
We propose a new solution concept to address the problem of sharing a surplus among the agents generating it. The problem is formulated in the preferences-endowments space. The solution is defined recursively, incorporating notions of consistency and fairness and relying on properties satisfied by the Shapley value for Transferable Utility (TU) games. We show that a solution exists and call it the Ordinal Shapley value (OSV). We characterize the OSV using the notion of coalitional dividends, and furthermore show that it is monotone and anonymous. Finally, similarly to the weighted Shapley value for TU games, we also construct a weighted OSV.
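For reference, the TU benchmark whose properties the construction leans on is the standard Shapley value; the formula below is the textbook definition together with its coalitional-dividend (Harsanyi dividend) form, stated here as background and not taken from the paper itself:

\[
\phi_i(v)=\sum_{S\subseteq N\setminus\{i\}}\frac{|S|!\,(|N|-|S|-1)!}{|N|!}\,\bigl(v(S\cup\{i\})-v(S)\bigr)
=\sum_{S\ni i}\frac{\Delta_v(S)}{|S|},
\qquad
\Delta_v(S)=\sum_{T\subseteq S}(-1)^{|S|-|T|}\,v(T).
\]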
Abstract:
The paper provides a description and analysis of the Hodgskin section of Theories of Surplus Value and the general law section of the first version of Volume III of Capital. It then considers Part III of Volume III, the evolution of Marx's thought and various interpretations of his theory in the light of this analysis. It is suggested that Marx thought that the rate of profit must fall and even in the 1870s hoped to be able to provide a demonstration of this. However, the main conclusions are: (1) Marx's major attempt to show that the rate of profit must fall occurred in the general law section; (2) Part III does not contain a demonstration that the rate of profit must fall; and (3) Marx was never able to demonstrate that the rate of profit must fall, and he was aware of this.
Abstract:
This paper examines, both descriptively and analytically, Marx's arguments for the falling rate of profit from the Hodgskin section of Theories of Surplus Value, the General Law section of the recently published Volume 33 of the Collected Works and Chapter 3 of Volume III of Capital. The conclusions are as follows: First, Marx realised that his main attempt to give an intrinsic explanation of the falling rate of profit, which occurred in the General Law section, had failed, but he still hoped that he would be able to demonstrate it in the future. Second, the Hodgskin and General Law sections contain a number of subsidiary explanations, mostly related to resource scarcity, some of which are correct. Third, Part III of Volume III does not contain a demonstration of the falling rate of profit, but a description of the role of the falling rate of profit in capitalist development. Fourth, it also contains suppressed references to resource scarcity. And finally, in Chapter 3 of Volume III, Marx says that it is resource scarcity that causes the fall in the rate of profit described in Part III of the same volume. The key to all these conclusions is the careful analysis of the General Law section.
Abstract:
The paper presents a foundation model for Marxian theories of the breakdown of capitalism based on a new falling rate of profit mechanism. All of these theories are based on one or more of "the historical tendencies": a rising capital-wage bill ratio, a rising capitalist share and a falling rate of profit. The model is a foundation in the sense that it generates these tendencies in the context of a model with a constant subsistence wage. The newly discovered generating mechanism is based on neo-classical reasoning for a model with land. It is non-Ricardian in that land-augmenting technical progress can be unboundedly rapid. Finally, since the model has no steady state, it is necessary to use a new technique, Chaplygin's method, to prove the result.
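In the standard Marxian notation (constant capital c, wage bill v, surplus s), used here only to pin down the three historical tendencies and not necessarily the paper's own notation, the tendencies read:

\[
\frac{c}{v}\ \text{rising},\qquad \frac{s}{s+v}\ \text{rising},\qquad r=\frac{s}{c+v}\ \text{falling}.
\]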
Abstract:
Marx's conclusions about the falling rate of profit have been analysed exhaustively. Usually this has been done by building models which broadly conform to Marx's views and then showing that his conclusions are either correct or, more frequently, that they cannot be sustained. By contrast, this paper examines, both descriptively and analytically, Marx's arguments from the Hodgskin section of Theories of Surplus Value, the General Law section of the recently published Volume 33 of the Collected Works and Chapter 3 of Volume III of Capital. It also gives a new interpretation of Part III of this last work. The main conclusions are, first, that Marx had an intrinsic explanation of the falling rate of profit but was unable to give it a satisfactory demonstration and, second, that he had a number of subsidiary explanations, of which the most important was resource scarcity. The paper closes with an assessment of the pedigree of various currents of Marxian thought on this issue.
Abstract:
We prove the non-emptiness of the core of an NTU game satisfying a condition of payoff-dependent balancedness, based on transfer rate mappings. We also define a new equilibrium condition on transfer rates and prove the existence of core payoff vectors satisfying this condition. The additional requirement of transfer rate equilibrium refines the core concept and allows the selection of specific core payoff vectors. Lastly, the class of parametrized cooperative games is introduced. This new setting and its associated equilibrium-core solution extend the usual cooperative game framework and core solution to situations depending on an exogenous environment. A non-emptiness result for the equilibrium-core is also provided in the context of a parametrized cooperative game. Our proofs borrow mathematical tools and geometric constructions from general equilibrium theory with non-convexities. Applications to extant results taken from game theory and economic theory are given.
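As a reminder of the object whose non-emptiness is at stake (the standard textbook definition, not the paper's balancedness condition): for an NTU game V assigning to each coalition S a feasible set V(S) of payoff vectors for its members, the core is

\[
C(V)=\bigl\{\,x\in V(N)\ :\ \text{there exist no } S\subseteq N \text{ and } y\in V(S) \text{ with } y_i>x_i \text{ for all } i\in S\,\bigr\}.
\]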
Abstract:
We study optimal contracts in a simple model where employees are averse to inequity as modelled by Fehr and Schmidt (1999). A "selfish" employer can profitably exploit such preferences among its employees by offering contracts which create inequity off the equilibrium path and would thus leave employees feeling envy or guilt if they did not meet the employer's demands. Such contracts resemble team and relative performance contracts, and thus we derive conditions under which it may be beneficial to form work teams of employees with distributional concerns who were previously working individually. Similar results are obtained for status-seeking and efficiency-concern preferences.
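For reference, the inequity-averse preferences of Fehr and Schmidt (1999) referred to in the abstract give player i, facing monetary payoffs x = (x_1, ..., x_n), the utility

\[
U_i(x)=x_i-\frac{\alpha_i}{n-1}\sum_{j\neq i}\max\{x_j-x_i,\,0\}-\frac{\beta_i}{n-1}\sum_{j\neq i}\max\{x_i-x_j,\,0\},
\]

where \alpha_i weights envy (disadvantageous inequity), \beta_i weights guilt (advantageous inequity), and \beta_i \le \alpha_i with 0 \le \beta_i < 1.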
Abstract:
This document, originally published as part of the book The Keys of success: the social, sporting, economic and communications impact of Barcelona’92, comes from a larger study that looked at all aspects of television in the Olympics and can be found in its original version in Miquel de Moragas Spà, Nancy K. Rivenburgh and James F. Larson (1996), Television in the Olympics, London: John Libbey.
Abstract:
A comparative study of 'Paths of Glory' (S. Kubrick) and 'King and Country' (J. Losey): two critical films about the First World War with a very similar theme but which offer different points of view.
Abstract:
The statistical analysis of compositional data should be treated using logratios of parts, which are difficult to use correctly in standard statistical packages. For this reason a freeware package, named CoDaPack, was created. This software implements most of the basic statistical methods suitable for compositional data. In this paper we describe the new version of the package, now called CoDaPack3D. It is developed in Visual Basic for Applications (associated with Excel©), Visual Basic and OpenGL, and it is oriented towards users with a minimum knowledge of computers, with the aim of being simple and easy to use. This new version includes new graphical output in 2D and 3D. These outputs can be zoomed and, in 3D, rotated. A customization menu is also included, and outputs can be saved in jpeg format. This new version also includes interactive help, and all dialog windows have been improved in order to facilitate their use. To use CoDaPack one has to access Excel© and introduce the data in a standard spreadsheet. These should be organized as a matrix where Excel© rows correspond to the observations and columns to the parts. The user executes macros that return numerical or graphical results. There are two kinds of numerical results: new variables and descriptive statistics, and both appear on the same sheet. Graphical output appears in independent windows. In the present version there are 8 menus, with a total of 38 submenus which, after some dialogue, directly call the corresponding macro. The dialogues ask the user to input the variables and further parameters needed, as well as where to put the results. The web site http://ima.udg.es/CoDaPack contains this freeware package, and only Microsoft Excel© under Microsoft Windows© is required to run the software. Key words: Compositional data analysis, Software
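CoDaPack itself is a Visual Basic/Excel© tool, but the logratio approach it implements is easy to illustrate. Below is a minimal Python/NumPy sketch of one standard logratio transformation, the centered logratio (clr); it illustrates the methodology and is not part of the CoDaPack package:

```python
import numpy as np

def clr(composition):
    """Centered logratio (clr) transform of a composition.

    composition: 1-D array of strictly positive parts (need not sum to 1).
    Returns log(x_i / g(x)), where g(x) is the geometric mean of the parts.
    """
    x = np.asarray(composition, dtype=float)
    if np.any(x <= 0):
        raise ValueError("all parts must be strictly positive")
    log_x = np.log(x)
    return log_x - log_x.mean()

# Example: a 3-part composition (e.g. percentages of sand, silt, clay).
sample = [70.0, 20.0, 10.0]
print(clr(sample))  # clr coordinates sum to zero by construction
```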
Abstract:
In the eighties, John Aitchison (1986) developed a new methodological approach for the statistical analysis of compositional data. This new methodology was implemented in Basic routines grouped under the name CODA and later NEWCODA in Matlab (Aitchison, 1997). After that, several other authors have published extensions to this methodology: Marín-Fernández and others (2000), Barceló-Vidal and others (2001), Pawlowsky-Glahn and Egozcue (2001, 2002) and Egozcue and others (2003). (...)
Abstract:
It describes and analyses the process of European integration from its beginnings in the 1950s up to the present day. The aim is to offer an overview of a process that started modestly with the integration of the coal and steel markets of the six founding members and that has led to a European Union of 27 member states and more than 500 million inhabitants, with common policies, close cooperation in the most sensitive areas of state sovereignty and a single currency in sixteen states.
Abstract:
This paper derives a model of markets with system goods and two technological standards. An established standard incurs lower unit production costs but causes a negative externality. The paper derives the conditions for policy intervention and compares the effect of direct and indirect cost-reducing subsidies in two markets with system goods in the presence of externalities. If consumers are committed to the technology by purchasing one of the components, direct subsidies are preferable. For a medium-low cost difference between technological standards and a low externality cost it is optimal to provide a direct subsidy only to the first technology adopter. The higher the externality cost, the more technology adopters should be provided with direct subsidies. This effect is robust in all extensions. In the absence of consumers' commitment to a technological standard, indirect and direct subsidies are both desirable. In this case, the subsidy to the first adopter is lower than the subsidy to the second adopter. Moreover, for a low cost difference between technological standards and a low externality cost the first firm chooses a superior standard without policy intervention. Finally, perfect compatibility between components based on different technological standards enhances the advantage of indirect subsidies for a medium-high externality cost and cost difference between technological standards. Journal of Economic Literature Classification Numbers: C72, D21, D40, H23, L13, L22, L51, O25, O33, O38. Keywords: Technological standards; complementary products; externalities; cost-reducing subsidies; compatibility.
Abstract:
One of the first useful products from the human genome will be a set of predicted genes. Besides its intrinsic scientific interest, the accuracy and completeness of this data set is of considerable importance for human health and medicine. Though progress has been made on computational gene identification in terms of both methods and accuracy evaluation measures, most of the sequence sets in which the programs are tested are short genomic sequences, and there is concern that these accuracy measures may not extrapolate well to larger, more challenging data sets. Given the absence of experimentally verified large genomic data sets, we constructed a semiartificial test set comprising a number of short single-gene genomic sequences with randomly generated intergenic regions. This test set, which should still present an easier problem than real human genomic sequence, mimics the approximately 200 kb long BACs being sequenced. In our experiments with these longer genomic sequences, the accuracy of GENSCAN, one of the most accurate ab initio gene prediction programs, dropped significantly, although its sensitivity remained high. Conversely, the accuracy of similarity-based programs, such as GENEWISE, PROCRUSTES, and BLASTX, was not affected significantly by the presence of random intergenic sequence, but depended on the strength of the similarity to the protein homolog. As expected, the accuracy dropped if the models were built using more distant homologs, and we were able to quantitatively estimate this decline. However, the specificities of these techniques are still rather good even when the similarity is weak, which is a desirable characteristic for driving expensive follow-up experiments. Our experiments suggest that though gene prediction will improve with every new protein that is discovered and through improvements in the current set of tools, we still have a long way to go before we can decipher the precise exonic structure of every gene in the human genome using purely computational methodology.
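The construction of a semiartificial test set lends itself to a short sketch. The Python snippet below shows the general idea of joining single-gene genomic sequences with randomly generated intergenic spacers; the spacer length, GC content and independence of bases are illustrative assumptions, not the exact procedure used in the paper:

```python
import random

def random_intergenic(length, gc=0.41, rng=random):
    """Generate a random 'intergenic' spacer with a given GC content.

    The ~41% GC default and independent, identically distributed bases are
    illustrative assumptions, not the paper's exact construction.
    """
    bases, weights = "GCAT", [gc / 2, gc / 2, (1 - gc) / 2, (1 - gc) / 2]
    return "".join(rng.choices(bases, weights=weights, k=length))

def build_semiartificial_contig(gene_seqs, spacer_len=20000, rng=random):
    """Concatenate single-gene sequences separated by random spacers,
    mimicking a longer (BAC-sized) genomic region."""
    pieces = []
    for i, gene in enumerate(gene_seqs):
        if i:
            pieces.append(random_intergenic(spacer_len, rng=rng))
        pieces.append(gene)
    return "".join(pieces)

# Example with toy sequences standing in for real single-gene regions.
contig = build_semiartificial_contig(["ATGAAATGA", "ATGCCCTGA"], spacer_len=50)
print(len(contig), contig[:60])
```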
Abstract:
The completion of the sequencing of the mouse genome promises to help predict human genes with greater accuracy. While current ab initio gene prediction programs are remarkably sensitive (i.e., they predict at least a fragment of most genes), their specificity is often low, predicting a large number of false-positive genes in the human genome. Sequence conservation at the protein level with the mouse genome can help eliminate some of those false positives. Here we describe SGP2, a gene prediction program that combines ab initio gene prediction with TBLASTX searches between two genome sequences to provide both sensitive and specific gene predictions. The accuracy of SGP2 when used to predict genes by comparing the human and mouse genomes is assessed on a number of data sets, including single-gene data sets, the highly curated human chromosome 22 predictions, and entire genome predictions from ENSEMBL. Results indicate that SGP2 outperforms purely ab initio gene prediction methods. Results also indicate that SGP2 works about as well with 3x shotgun data as it does with fully assembled genomes. SGP2 provides a high enough specificity that its predictions can be experimentally verified at a reasonable cost. SGP2 was used to generate a complete set of gene predictions on both the human and mouse by comparing the genomes of these two species. Our results suggest that another few thousand human and mouse genes currently not in ENSEMBL are worth verifying experimentally.
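The comparative idea behind SGP2, using TBLASTX conservation against a second genome to favour ab initio candidate exons, can be caricatured as follows; the data structures and the additive score bonus are purely illustrative assumptions and do not reflect SGP2's actual scoring scheme:

```python
from typing import List, Tuple

Interval = Tuple[int, int]  # (start, end) on the genomic sequence

def overlaps(a: Interval, b: Interval) -> bool:
    return a[0] < b[1] and b[0] < a[1]

def rescore_exons(ab_initio_exons: List[Tuple[Interval, float]],
                  tblastx_hits: List[Interval],
                  bonus: float = 1.0) -> List[Tuple[Interval, float]]:
    """Toy illustration of comparative gene prediction: candidate exons from
    an ab initio predictor get their score increased when they overlap a
    TBLASTX hit against the second genome (hypothetical scoring, for
    illustration only)."""
    rescored = []
    for exon, score in ab_initio_exons:
        conserved = any(overlaps(exon, hit) for hit in tblastx_hits)
        rescored.append((exon, score + bonus if conserved else score))
    return rescored

# Example: two candidate exons, one supported by a hit against the mouse genome.
exons = [((100, 250), 2.3), ((900, 1020), 1.1)]
hits = [(120, 240)]
print(rescore_exons(exons, hits))
```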