24 results for Geometry, Descriptive

at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance: 20.00%

Abstract:

This paper examines, both descriptively and analytically, Marx's arguments for the falling rate of profit from the Hodgskin section of Theories of Surplus Value, the General Law section of the recently published Volume 33 of the Collected Works and Chapter 3 of Volume III of Capital. The conclusions are as follows. First, Marx realised that his main attempt to give an intrinsic explanation of the falling rate of profit, which occurred in the General Law section, had failed, but he still hoped to demonstrate it in the future. Second, the Hodgskin and General Law sections contain a number of subsidiary explanations, mostly related to resource scarcity, some of which are correct. Third, Part III of Volume III does not contain a demonstration of the falling rate of profit, but a description of the role of the falling rate of profit in capitalist development. Fourth, it also contains suppressed references to resource scarcity. Finally, in Chapter 3 of Volume III, Marx says that it is resource scarcity that causes the fall in the rate of profit described in Part III of the same volume. The key to all these conclusions is the careful analysis of the General Law section.
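For context, the quantitative claim at issue can be stated with the standard Marxian definitions, which the abstract does not spell out: the rate of profit is r = s/(c + v), where c is constant capital, v variable capital and s surplus value, so r falls as the organic composition c/v rises while the rate of surplus value s/v stays constant. A minimal numeric sketch of that arithmetic:

```python
# Rate of profit r = s / (c + v). Dividing numerator and denominator by v:
#   r = (s/v) / (c/v + 1)
# so with the rate of surplus value s/v held fixed, a rising organic
# composition c/v drives r down.

def rate_of_profit(surplus_rate: float, organic_composition: float) -> float:
    """r as a function of s/v and c/v."""
    return surplus_rate / (organic_composition + 1.0)

# Hold s/v = 1.0 and let c/v rise: r falls monotonically.
profits = [rate_of_profit(1.0, cv) for cv in (1.0, 2.0, 4.0, 8.0)]
print(profits)  # 0.5, 0.333..., 0.2, 0.111...
```

This is only the formal skeleton of the argument; the papers above concern whether the premise of a rising c/v at constant s/v can itself be demonstrated.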

Relevance: 20.00%

Abstract:

Marx's conclusions about the falling rate of profit have been analysed exhaustively. Usually this has been done by building models which broadly conform to Marx's views and then showing that his conclusions are either correct or, more frequently, that they cannot be sustained. By contrast, this paper examines, both descriptively and analytically, Marx's arguments from the Hodgskin section of Theories of Surplus Value, the General Law section of the recently published Volume 33 of the Collected Works and Chapter 3 of Volume III of Capital. It also gives a new interpretation of Part III of this last work. The main conclusions are, first, that Marx had an intrinsic explanation of the falling rate of profit but was unable to give it a satisfactory demonstration and, second, that he had a number of subsidiary explanations, of which the most important was resource scarcity. The paper closes with an assessment of the pedigree of various currents of Marxian thought on this issue.


Relevance: 20.00%

Abstract:

The first main result of the paper is a criterion for a partially commutative group G to be a domain. It allows us to reduce the study of algebraic sets over G to the study of irreducible algebraic sets, and the elementary theory of G (respectively, of a coordinate group over G) to the elementary theories of the direct factors of G (respectively, of coordinate groups of irreducible algebraic sets). We then establish normal forms for quantifier-free formulas over a non-abelian directly indecomposable partially commutative group H. Analogously to the case of free groups, we introduce the notion of a generalised equation and prove that the positive theory of H has quantifier elimination and that arbitrary first-order formulas lift from H to H * F, where F is a free group of finite rank. As a consequence, the positive theory of an arbitrary partially commutative group is decidable.

Relevance: 20.00%

Abstract:

In this paper we investigate the role of horospheres in Integral Geometry and Differential Geometry. In particular we study envelopes of families of horocycles by means of “support maps”. We define invariant “linear combinations” of support maps or curves. Finally we obtain Gauss-Bonnet type formulas and Chern-Lashof type inequalities.

Relevance: 20.00%

Abstract:

Continuity of set-valued maps is here revisited: after recalling some basic concepts of variational analysis and giving a short description of the state of the art, we obtain as a by-product two Sard-type results concerning local minima of scalar- and vector-valued functions. Our main result, though, is inscribed in the framework of tame geometry: a closed-valued semialgebraic set-valued map is almost everywhere continuous (in both the topological and the measure-theoretic sense). The result, which depends on stratification techniques, holds true in the more general setting of o-minimal (or tame) set-valued maps. Some applications are briefly discussed at the end.

Relevance: 20.00%

Abstract:

Descriptive set theory is mainly concerned with studying subsets of the space of all countable binary sequences. In this paper we study the generalization where countable is replaced by uncountable. We explore properties of generalized Baire and Cantor spaces, equivalence relations and their Borel reducibility. The study shows that descriptive set theory looks very different in this generalized setting compared to the classical, countable case. We also draw the connection between the stability-theoretic complexity of first-order theories and the descriptive set-theoretic complexity of their isomorphism relations. Our results suggest that Borel reducibility on uncountable structures is a model-theoretically natural way to compare the complexity of isomorphism relations.

Relevance: 20.00%

Abstract:

A novel metric comparison of the appendicular skeleton (fore and hind limb) of different vertebrates using the Compositional Data Analysis (CDA) methodological approach is presented. 355 specimens belonging to various taxa of Dinosauria (Sauropodomorpha, Theropoda, Ornithischia and Aves) and Mammalia (Prototheria, Metatheria and Eutheria) were analyzed with CDA. A special focus has been placed on Sauropodomorpha dinosaurs, and the Aitchison distance has been used as a measure of disparity in limb element proportions to infer some aspects of functional morphology.
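The Aitchison distance used above as a disparity measure is the Euclidean distance between centred log-ratio (clr) transformed compositions, a standard construction in CDA. A minimal sketch, in which the limb-element proportions are invented purely for illustration:

```python
import numpy as np

def clr(x):
    """Centred log-ratio transform of a composition (strictly positive parts)."""
    logx = np.log(np.asarray(x, dtype=float))
    return logx - logx.mean()

def aitchison_distance(x, y):
    """Euclidean distance between clr-transformed compositions."""
    return float(np.linalg.norm(clr(x) - clr(y)))

# Hypothetical limb-element proportions (humerus, radius, metacarpal)
# for two specimens; only the ratios between parts matter, not the scale.
a = [0.55, 0.30, 0.15]
b = [0.45, 0.35, 0.20]
print(aitchison_distance(a, b))
```

Because the clr transform subtracts the mean log, the distance is invariant to rescaling a specimen's measurements, which is exactly why it suits proportion data.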

Relevance: 20.00%

Abstract:

Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently, when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine that the partition can be refined, so that the probability density would represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea leads to a Hilbert space of probability densities, obtained by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities.
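The partition construction described above can be sketched directly: each interval of a partition of the support receives its probability mass, giving a D-part composition, and refining the partition approaches the "continuous composition" of the density. The normal density and the particular breaks below are illustrative assumptions, not taken from the paper:

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def density_to_composition(breaks, mu=0.0, sigma=1.0):
    """Probability of each interval of the partition -> a D-part composition."""
    cdf = [normal_cdf(b, mu, sigma) for b in breaks]
    return [hi - lo for lo, hi in zip(cdf, cdf[1:])]

# Partition the support into D = 4 parts at -1, 0, 1; refining the breaks
# yields ever-finer compositions of the same density.
breaks = [-math.inf, -1.0, 0.0, 1.0, math.inf]
comp = density_to_composition(breaks)
print(comp)  # four positive parts summing to 1
```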

Relevance: 20.00%

Abstract:

Demosaicking is a particular case of interpolation problems where, from a scalar image in which each pixel has either the red, the green or the blue component, we want to interpolate the full-color image. State-of-the-art demosaicking algorithms perform interpolation along edges, but these edges are estimated locally. We propose a level-set-based geometric method to estimate image edges, inspired by the image inpainting literature. This method has a time complexity of O(S), where S is the number of pixels in the image, and compares favorably with state-of-the-art algorithms both visually and in most relevant image quality measures.
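The paper's level-set method is not reproduced here; to make the problem concrete, the sketch below is only the textbook bilinear baseline on an assumed RGGB Bayer layout, where each missing channel value is the isotropic average of its sampled neighbours (edge-aware methods replace this average with interpolation along edges):

```python
import numpy as np

def box3(a):
    """Sum over each pixel's 3x3 neighbourhood (zero-padded)."""
    p = np.pad(a, 1)
    h, w = a.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

def bayer_masks(h, w):
    """Boolean sampling masks for an RGGB Bayer layout (an assumed convention)."""
    r = np.zeros((h, w), bool); r[0::2, 0::2] = True
    b = np.zeros((h, w), bool); b[1::2, 1::2] = True
    return r, ~(r | b), b

def demosaic_bilinear(mosaic):
    """Fill each channel with the average of its sampled 3x3 neighbours."""
    h, w = mosaic.shape
    out = np.empty((h, w, 3))
    for c, mask in enumerate(bayer_masks(h, w)):
        m = mask.astype(float)
        # sum of sampled neighbour values / count of sampled neighbours
        out[:, :, c] = box3(mosaic * m) / box3(m)
    return out

# A flat grey scene survives demosaicking unchanged.
flat = np.full((4, 4), 0.5)
print(demosaic_bilinear(flat)[..., 1])
```

Like the method in the abstract, this baseline runs in O(S) time; the difference lies entirely in how the interpolation direction is chosen.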

Relevance: 20.00%

Abstract:

The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P), together with their corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way very elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, combinations of statistical information such as Bayesian updating, the combination of likelihoods and robust M-estimation functions are simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turns out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
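The claim that Bayesian updating is a simple addition/perturbation can be checked directly in the finite case: on a partition with D parts, the Aitchison perturbation is the closed (renormalized) elementwise product, and Bayes' rule posterior ∝ prior × likelihood is exactly the prior perturbed by the likelihood. The three-hypothesis prior and likelihood below are made up for illustration:

```python
import numpy as np

def closure(x):
    """Rescale positive parts to sum to 1 (a point of the simplex)."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def perturb(x, y):
    """Aitchison perturbation, the 'addition' of the simplex geometry."""
    return closure(np.asarray(x, float) * np.asarray(y, float))

# Bayes' rule on three hypotheses: posterior = closure(prior * likelihood),
# i.e. the posterior is the prior perturbed by the likelihood.
prior = closure([1.0, 1.0, 1.0])
likelihood = [0.7, 0.2, 0.1]   # P(data | hypothesis), illustrative values
posterior = perturb(prior, likelihood)
print(posterior)  # [0.7, 0.2, 0.1]
```

A second observation is just another perturbation, so accumulating evidence really is repeated "addition" in this geometry.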

Relevance: 20.00%

Abstract:

Expected utility theory (EUT) has been challenged as a descriptive theory in many contexts. The medical decision analysis context is not an exception. Several researchers have suggested that rank-dependent utility theory (RDUT) may accurately describe how people evaluate alternative medical treatments. Recent research in this domain has addressed a relevant feature of RDU models, probability weighting, but to date no direct test of this theory has been made. This paper provides a test of the main axiomatic difference between EUT and RDUT when health profiles are used as outcomes of risky treatments. Overall, EU best described the data. However, evidence of the editing and cancellation operations hypothesized in Prospect Theory and Cumulative Prospect Theory was apparent in our study: we found that RDU outperformed EU in the presentation of the risky treatment pairs in which the common outcome was not obvious. The influence of framing effects on the performance of RDU and their importance as a topic for future research are discussed.
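The formal difference under test can be made concrete: EU weights each outcome's utility linearly by its probability, while RDU weights it by differences of a probability weighting function w applied to decumulative (rank-ordered) probabilities. A sketch with an illustrative linear utility and convex weighting function, neither taken from the paper:

```python
def expected_utility(outcomes, probs, u):
    """EU: utilities weighted linearly by probability."""
    return sum(p * u(x) for x, p in zip(outcomes, probs))

def rank_dependent_utility(outcomes, probs, u, w):
    """RDU: sort outcomes best-first, then weight each utility by the
    increment of w over the decumulative probabilities."""
    ranked = sorted(zip(outcomes, probs), key=lambda t: u(t[0]), reverse=True)
    total, prev = 0.0, 0.0
    for x, p in ranked:
        cur = prev + p          # decumulative probability up to this rank
        total += (w(cur) - w(prev)) * u(x)
        prev = cur
    return total

u = lambda x: x        # linear utility, illustrative
w = lambda p: p ** 2   # convex (pessimistic) weighting, illustrative
lottery = ([100.0, 0.0], [0.5, 0.5])
print(expected_utility(*lottery, u))           # 50.0
print(rank_dependent_utility(*lottery, u, w))  # 25.0: the best outcome gets weight w(0.5) = 0.25
```

With w(p) = p the two functionals coincide, which is precisely the axiomatic boundary the paper tests.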