952 results for Analytic number theory


Relevance: 30.00%

Publisher:

Abstract:

Marx's conclusions about the falling rate of profit have been analysed exhaustively. Usually this has been done by building models which broadly conform to Marx's views and then showing that his conclusions are either correct or, more frequently, that they cannot be sustained. By contrast, this paper examines, both descriptively and analytically, Marx's arguments from the Hodgskin section of Theories of Surplus Value, the General Law section of the recently published Volume 33 of the Collected Works and Chapter 3 of Volume III of Capital. It also gives a new interpretation of Part III of this last work. The main conclusions are, first, that Marx had an intrinsic explanation of the falling rate of profit but was unable to give it a satisfactory demonstration and, second, that he had a number of subsidiary explanations, of which the most important was resource scarcity. The paper closes with an assessment of the pedigree of various currents of Marxian thought on this issue.

Relevance: 30.00%

Publisher:

Abstract:

A growing literature integrates theories of debt management into models of optimal fiscal policy. One promising theory argues that the composition of government debt should be chosen so that fluctuations in the market value of debt offset changes in expected future deficits. This complete markets approach to debt management is valid even when the government only issues non-contingent bonds. A number of authors conclude from this approach that governments should issue long-term debt and invest in short-term assets. We argue that the conclusions of this approach are too fragile to serve as a basis for policy recommendations. This is because bonds at different maturities have highly correlated returns, causing the determination of the optimal portfolio to be ill-conditioned. To make this point concrete we examine the implications of this approach to debt management in various models, both analytically and using numerical methods calibrated to the US economy. We find that the complete markets approach recommends asset positions which are huge multiples of GDP. Introducing persistent shocks or capital accumulation only worsens this problem. Increasing the volatility of interest rates through habits partly reduces the size of these positions. Across these simulations we find no presumption that governments should issue long-term debt - policy recommendations can be easily reversed through small perturbations in the specification of shocks or small variations in the maturity of bonds issued. We further extend the literature by removing the assumption that governments costlessly repurchase all outstanding debt every period. This exacerbates the size of the required positions, worsens their volatility and in some cases produces instability in debt holdings. We conclude that it is very difficult to insulate fiscal policy from shocks by using the complete markets approach to debt management. Given the limited variability of the yield curve, using maturities is a poor way to substitute for state-contingent debt. The result is that the positions recommended by this approach conflict with a number of features that we believe are important in making bond markets incomplete, e.g. allowing for transaction costs, liquidity effects, etc. Until these features are all fully incorporated we remain in search of a theory of debt management capable of providing robust policy insights.
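A minimal numerical sketch (illustrative values only, not calibrated to the paper's models) of why the complete markets portfolio is ill-conditioned: when the returns on bonds of neighbouring maturities are almost perfectly correlated, the covariance matrix is nearly singular, so the hedging positions come out as huge offsetting multiples of GDP and flip sign under tiny perturbations of the shock process.

    import numpy as np

    # Two bonds of neighbouring maturities with highly correlated returns.
    sd = np.array([0.020, 0.021])            # return volatilities (illustrative)
    rho = 0.999                              # correlation between the two returns
    cov = np.array([[sd[0]**2, rho*sd[0]*sd[1]],
                    [rho*sd[0]*sd[1], sd[1]**2]])

    # Covariance of each bond's return with the fiscal shock to be hedged.
    b = np.array([0.0010, 0.0011])           # illustrative, in units of GDP

    positions = np.linalg.solve(cov, b)      # complete-markets hedge portfolio
    print("condition number:", np.linalg.cond(cov))
    print("positions (multiples of GDP):", positions)

    # A tiny change in the shock covariances reverses the recommendation.
    b_perturbed = b + np.array([0.00005, -0.00005])
    print("perturbed positions:", np.linalg.solve(cov, b_perturbed))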

Relevance: 30.00%

Publisher:

Abstract:

Recent theoretical developments and case study evidence suggest a relationship between the military in politics and corruption. This study contributes to this literature by analyzing, for the first time, the relationship between the military in politics and corruption both theoretically and empirically. By drawing on a cross-sectional and panel data set covering a large number of countries over the period 1984-2007, and using a variety of econometric methods, substantial empirical support is found for a positive relationship between the military in politics and corruption. In sum, our results reveal that a one standard deviation increase in the military in politics leads to a 0.22-unit increase in the corruption index. This relationship is shown to be robust to a variety of specification changes, different econometric techniques, different sample sizes, alternative corruption indices and the exclusion of outliers. This study suggests that the explanatory power of the military in politics is at least as important as the conventionally accepted causes of corruption, such as economic development.
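To make the headline magnitude concrete, the reported effect is the usual standardized reading of a regression coefficient: multiply the estimated coefficient on the military-in-politics score by that score's standard deviation. A minimal sketch with synthetic data (variable names and numbers are hypothetical, not the authors' dataset):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 500                                    # synthetic country-year observations
    military = rng.normal(size=n)              # 'military in politics' score (synthetic)
    development = rng.normal(size=n)           # stand-in control variable
    corruption = 0.3 * military + 0.5 * development + rng.normal(size=n)

    # OLS with an intercept; coefficients ordered [const, military, development].
    X = np.column_stack([np.ones(n), military, development])
    beta, *_ = np.linalg.lstsq(X, corruption, rcond=None)

    # Effect of a one standard deviation increase in the regressor of interest.
    print(round(beta[1] * military.std(), 2))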

Relevance: 30.00%

Publisher:

Abstract:

Adverse selection may thwart trade between an informed seller, who knows the probability p that an item of antiquity is genuine, and an uninformed buyer, who does not know p. The buyer might not be wholly uninformed, however. Suppose he can perform a simple inspection, a test of his own: the probability that an item passes the test is g if the item is genuine, but only f < g if it is fake. Given that the buyer is no expert, his test may have little power: f may be close to g. Unfortunately, without much power, the buyer's test will not resolve the difficulty of adverse selection; gains from trade may remain unexploited. But now consider a "store", where the seller groups a number of items, perhaps all with the same quality, the same probability p of being genuine. (We show that in equilibrium the seller will choose to group items in this manner.) Now the buyer can conduct his test across a large sample, perhaps all, of a group of items in the seller's store. He can thereby assess the overall quality of these items; he can invert the aggregate of his test results to uncover the underlying p; he can form a "prior". There is thus no longer asymmetric information between seller and buyer: gains from trade can be exploited. This is our theory of retailing: by grouping items together - setting up a store - a seller is able to supply buyers with priors, as well as the items themselves. We show that the weaker the power of the buyer's test (the closer f is to g), the greater the seller's profit. So the seller has no incentive to assist the buyer - e.g., by performing her own tests on the items, or by cleaning them to reveal more about their true age. The paper ends with an analysis of which sellers should specialise in which qualities. We show that quality will be low in busy locations and high in expensive locations.
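The inversion step described above is a one-line calculation: if a fraction q of a large group of items of common quality p passes the buyer's test, then by the law of large numbers q is approximately pg + (1-p)f, so the buyer recovers

    \[
    q \;=\; p\,g + (1-p)\,f
    \qquad\Longrightarrow\qquad
    p \;=\; \frac{q - f}{g - f},
    \]

which is well defined whenever g > f. With finitely many items the estimate is noisier the closer f is to g, since the sampling error in q is amplified by the factor 1/(g - f).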

Relevance: 30.00%

Publisher:

Abstract:

Arbuscular mycorrhizal fungi (AMF) are ancient asexually reproducing organisms that form symbioses with the majority of plant species, improving plant nutrition and promoting plant diversity. Little is known about the evolution or organization of the genomes of any eukaryotic symbiont or ancient asexual organism. Direct evidence shows that one AMF species is heterokaryotic; that is, containing populations of genetically different nuclei. It has been suggested, however, that the genetic variation passed from generation to generation in AMF is simply due to multiple chromosome sets (that is, high ploidy). Here we show that previously documented genetic variation in Pol-like sequences, which are passed from generation to generation, cannot be due to either high ploidy or repeated gene duplications. Our results provide the clearest evidence so far for substantial genetic differences among nuclei in AMF. We also show that even AMF with a very large nuclear DNA content are haploid. An underlying principle of evolutionary theory is that an individual passes on one or half of its genome to each of its progeny. The coexistence of a population of many genomes in AMF and their transfer to subsequent generations, therefore, has far-reaching consequences for understanding genome evolution.

Relevance: 30.00%

Publisher:

Abstract:

In this article, we present a new approach to Nekhoroshev theory for a generic unperturbed Hamiltonian which completely avoids small-divisor problems. The proof is an extension of a method introduced by P. Lochak which combines averaging along periodic orbits with simultaneous Diophantine approximation, and uses geometric arguments designed by the second author to handle generic integrable Hamiltonians. This method allows us to deal with generic non-analytic Hamiltonians and to obtain new results on generic stability around linearly stable tori.
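For orientation, the shape of a Nekhoroshev-type stability estimate (stated here in its standard form, not as the precise result of this article) for a near-integrable Hamiltonian H(theta, I) = h(I) + epsilon f(theta, I) is that the actions drift only by a power of epsilon over a time exponentially long in a power of 1/epsilon:

    \[
    \|I(t) - I(0)\| \;\le\; C\,\varepsilon^{b}
    \quad\text{for}\quad
    |t| \;\le\; T \exp\!\big(c\,\varepsilon^{-a}\big),
    \]

with positive exponents a, b depending on steepness or, as here, genericity properties of h.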

Relevance: 30.00%

Publisher:

Abstract:

A growing number of studies have been addressing the relationship between theory of mind (TOM) and executive functions (EF) in patients with acquired neurological pathology. In order to provide a global overview of the main findings, we conducted a systematic review of group studies in which we aimed to (1) evaluate the patterns of impaired and preserved abilities of both TOM and EF in groups of patients with acquired neurological pathology and (2) investigate the existence of particular relations between different EF domains and TOM tasks. The search was conducted in Pubmed/Medline. A total of 24 articles met the inclusion criteria. We considered for analysis classical clinically accepted TOM tasks (first- and second-order false belief stories, the Faux Pas test, Happe's stories, the Mind in the Eyes task, and cartoon tasks) and EF domains (updating, shifting, inhibition, and access). The review suggests that (1) EF and TOM appear tightly associated; however, the few dissociations observed suggest they cannot be reduced to a single function; (2) no executive subprocess could be specifically associated with TOM performance; (3) the first-order false belief task and Happe's stories task seem to be less sensitive to neurological pathologies and less associated with EF. Even though the analysis of the reviewed studies demonstrates a close relationship between TOM and EF in patients with acquired neurological pathology, the nature of this relationship must be further investigated. Studies investigating the ecological consequences of TOM and EF deficits, as well as intervention studies, may bring further contributions to this question.

Relevance: 30.00%

Publisher:

Abstract:

In this paper we prove a formula for the analytic index of a basic Dirac-type operator on a Riemannian foliation, solving a problem that has been open for many years. We also consider more general indices given by twisting the basic Dirac operator by a representation of the orthogonal group. The formula is a sum of integrals over blowups of the strata of the foliation and also involves eta invariants of associated elliptic operators. As a special case, a Gauss-Bonnet formula for the basic Euler characteristic is obtained using two independent proofs.
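For context, the classical prototype of such a result is the Gauss-Bonnet theorem: for a closed oriented surface M with Gaussian curvature K,

    \[
    \chi(M) \;=\; \frac{1}{2\pi}\int_{M} K \, dA .
    \]

In the foliated setting of the paper, the basic Euler characteristic plays the role of chi(M), and the curvature integral is replaced by integrals over blowups of the strata together with eta-invariant corrections.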

Relevance: 30.00%

Publisher:

Abstract:

We give the first systematic study of strong isomorphism reductions, a notion of reduction more appropriate than polynomial-time reduction when, for example, comparing the computational complexity of the isomorphism problem for different classes of structures. We show that the partial ordering of its degrees is quite rich. We analyze its relationship to a further type of reduction between classes of structures based purely on comparing, for every n, the number of nonisomorphic structures of cardinality at most n in both classes. Furthermore, in a more general setting we address the question of the existence of a maximal element in the partial ordering of the degrees.
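For readers unfamiliar with the notion, the definition is stated here in its usual form (the paper may impose resource bounds, e.g. polynomial-time computability of f, which are assumptions of this sketch): a class C of structures strongly isomorphism-reduces to a class D if there is a computable map f from C to D such that, for all A, B in C,

    \[
    A \cong B \quad\Longleftrightarrow\quad f(A) \cong f(B).
    \]

The second, coarser type of reduction mentioned in the abstract instead compares, for every n, how many pairwise non-isomorphic structures of cardinality at most n the two classes contain.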

Relevance: 30.00%

Publisher:

Abstract:

After a historical survey of temperament in Johann Sebastian Bach's Well-Tempered Clavier, an analysis of the work has been made by applying a number of historical good temperaments as well as some recent proposals. The results obtained show that the global dissonance for all preludes and fugues in major keys can be minimized using the Kirnberger II temperament. The method of analysis used for this research is based on the mathematical theories of sensory dissonance, which have been developed by authors such as Hermann Ludwig Ferdinand von Helmholtz, Harry Partch, Reinier Plomp, Willem J. M. Levelt and William A. Sethares.
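A minimal sketch of a sensory-dissonance computation, assuming the Plomp-Levelt roughness curve in Sethares' parameterization (the study's actual weighting of partials, amplitudes and note durations may differ): it scores a chord by summing pairwise roughness over the partials of its notes, so different temperaments can be compared by the frequencies they assign to the same written pitches.

    import itertools
    import numpy as np

    def pair_roughness(f1, f2, a1=1.0, a2=1.0):
        """Plomp-Levelt roughness of two partials (Sethares' parameter values)."""
        d_star, s1, s2, b1, b2 = 0.24, 0.0207, 18.96, 3.51, 5.75
        f_low, f_high = min(f1, f2), max(f1, f2)
        x = d_star / (s1 * f_low + s2) * (f_high - f_low)
        return a1 * a2 * (np.exp(-b1 * x) - np.exp(-b2 * x))

    def chord_dissonance(fundamentals, n_partials=6):
        """Total roughness of a chord, summed over all pairs of partials."""
        partials = [(k * f, 0.88 ** (k - 1))          # harmonic series, decaying amplitude
                    for f in fundamentals for k in range(1, n_partials + 1)]
        return sum(pair_roughness(fa, fb, aa, ab)
                   for (fa, aa), (fb, ab) in itertools.combinations(partials, 2))

    # C major triad: equal temperament vs. just intonation (illustrative comparison).
    c4 = 261.63
    print(chord_dissonance([c4, c4 * 2 ** (4 / 12), c4 * 2 ** (7 / 12)]))
    print(chord_dissonance([c4, c4 * 5 / 4, c4 * 3 / 2]))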

Relevance: 30.00%

Publisher:

Abstract:

We present a KAM theory for some dissipative systems (geometrically, these are conformally symplectic systems, i.e. systems that transform a symplectic form into a multiple of itself). For systems with n degrees of freedom depending on n parameters we show that it is possible to find solutions with n-dimensional (Diophantine) frequencies by adjusting the parameters. We do not assume that the system is close to integrable, but we use an a-posteriori format. Our unknowns are a parameterization of the solution and a parameter. We show that if there is a sufficiently approximate solution of the invariance equation, which also satisfies some explicit non-degeneracy conditions, then there is a true solution nearby. We present results both in Sobolev norms and in analytic norms. The a-posteriori format has several consequences: A) smooth dependence on the parameters, including the singular limit of zero dissipation; B) estimates on the measure of parameters covered by quasi-periodic solutions; C) convergence of perturbative expansions in analytic systems; D) bootstrap of regularity (i.e., that all tori which are smooth enough are analytic if the map is analytic); E) a numerically efficient criterion for the breakdown of the quasi-periodic solutions. The proof is based on an iterative quadratically convergent method and on suitable estimates on the (analytic and Sobolev) norms of the approximate solution. The iterative step takes advantage of some geometric identities, which give a very useful coordinate system in the neighborhood of invariant (or approximately invariant) tori. This system of coordinates has several other uses: A) it shows that for dissipative conformally symplectic systems the quasi-periodic solutions are attractors; B) it leads to efficient algorithms, which have been implemented elsewhere. Details of the proof are given mainly for maps, but we also explain the slight modifications needed for flows and we devote the appendix to presenting explicit algorithms for flows.
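In the a-posteriori formulation described above, for a family f_mu of conformally symplectic maps the unknowns are an embedding K of the n-torus and the parameter mu, and the invariance equation they must satisfy is (notation assumed here; the paper's conventions may differ)

    \[
    f_{\mu}\circ K(\theta) \;=\; K(\theta + \omega),
    \qquad \theta \in \mathbb{T}^{n},
    \]

with omega the prescribed Diophantine frequency; the theorem then states that an approximate solution with small enough error, satisfying the explicit non-degeneracy conditions, lies near an exact one. Conformal symplecticity means the map rescales the symplectic form, f_mu^* Omega = lambda Omega.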

Relevance: 30.00%

Publisher:

Abstract:

In the forensic examination of DNA mixtures, the question of how to set the total number of contributors (N) remains a topic of ongoing interest. Part of the discussion gravitates around issues of bias, in particular when assessments of the number of contributors are not made prior to considering the genotypic configuration of potential donors. Further complication may stem from the observation that, in some cases, there may be numbers of contributors that are incompatible with the set of alleles seen in the profile of a mixed crime stain, given the genotype of a potential contributor. In such situations, procedures that take a single, fixed number of contributors as their output can lead to inferential impasses. Assessing the number of contributors within a probabilistic framework can help avoid such complications. Using elements of decision theory, this paper analyses two strategies for inference on the number of contributors. One procedure is deterministic and focuses on the minimum number of contributors required to 'explain' an observed set of alleles. The other is probabilistic, using Bayes' theorem, and provides a probability distribution for a set of numbers of contributors, based on the set of observed alleles as well as their respective rates of occurrence. The discussion concentrates on mixed stains of varying quality (i.e., different numbers of loci for which genotyping information is available). A so-called qualitative interpretation is pursued since quantitative information such as peak area and height data is not taken into account. The competing procedures are compared using a standard scoring rule that penalizes the degree of divergence between a given agreed value for N, that is, the number of contributors, and the actual value taken by N. Using only modest assumptions and a discussion with reference to a casework example, this paper reports on analyses using simulation techniques and graphical models (i.e., Bayesian networks) to point out that setting the number of contributors to a mixed crime stain in probabilistic terms is, for the conditions assumed in this study, preferable to a decision policy that uses categorical assumptions about N.
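A minimal single-locus sketch of the two competing procedures, assuming Hardy-Weinberg proportions and a purely qualitative model (only the set of distinct alleles is used, no peak areas or heights; the allele frequencies and the prior over N below are illustrative):

    from itertools import product
    from math import ceil

    def min_contributors(observed_alleles):
        """Deterministic rule: each contributor carries at most two alleles per locus."""
        return ceil(len(set(observed_alleles)) / 2)

    def posterior_n(observed_alleles, freqs, prior, n_max=4):
        """P(N | observed allele set) at one locus under a qualitative model."""
        observed = frozenset(observed_alleles)
        alleles = list(freqs)
        unnormalized = {}
        for n in range(1, n_max + 1):
            # P(exactly this allele set | N = n): brute-force sum over all
            # assignments of the 2n allele copies (fine for small n).
            p_set = 0.0
            for copies in product(alleles, repeat=2 * n):
                if frozenset(copies) == observed:
                    p = 1.0
                    for a in copies:
                        p *= freqs[a]
                    p_set += p
            unnormalized[n] = prior.get(n, 0.0) * p_set
        z = sum(unnormalized.values())
        return {n: round(v / z, 4) for n, v in unnormalized.items()}

    freqs = {"A": 0.4, "B": 0.3, "C": 0.2, "D": 0.1}    # illustrative frequencies
    prior = {n: 0.25 for n in range(1, 5)}              # flat prior over N
    stain = ["A", "B", "C"]
    print(min_contributors(stain))                       # -> 2
    print(posterior_n(stain, freqs, prior))

The scoring-rule comparison described in the abstract then penalizes the divergence between the value of N acted upon and its true value; a posterior of this kind is the input to that decision analysis.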

Relevance: 30.00%

Publisher:

Abstract:

In multiuser detection, the set of users active at any time may be unknown to the receiver. Under these conditions, optimum reception consists of simultaneously detecting the set of active users and their data, a problem that can be solved exactly by applying random-set theory (RST) and Bayesian recursions (BRs). However, implementation of optimum receivers may be limited by their complexity, which grows exponentially with the number of potential users. In this paper we examine three strategies leading to reduced-complexity receivers. In particular, we show how a simple approximation of BRs enables the use of the Sphere Detection (SD) algorithm, which exhibits satisfactory performance with limited complexity.
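A toy sketch (not the RST/BR formulation of the paper) of why joint activity-and-data detection is exponential in the number of potential users: each user's symbol is drawn from {-1, 0, +1}, with 0 meaning 'inactive', and the exhaustive optimum receiver searches all 3**K hypotheses. Sphere detection replaces this exhaustive search with a tree search that discards branches whose partial residual already exceeds a chosen radius.

    import itertools
    import numpy as np

    def exhaustive_joint_detection(y, S):
        """Jointly detect activity and data by minimizing ||y - S b||^2 over b in {-1,0,+1}^K."""
        K = S.shape[1]
        best, best_metric = None, np.inf
        for b in itertools.product((-1.0, 0.0, 1.0), repeat=K):  # 3**K hypotheses
            metric = np.sum((y - S @ np.array(b)) ** 2)
            if metric < best_metric:
                best, best_metric = np.array(b), metric
        return best

    # Toy system: 4 potential users, 8-chip signatures, 2 users actually active.
    rng = np.random.default_rng(1)
    S = rng.choice([-1.0, 1.0], size=(8, 4)) / np.sqrt(8)
    b_true = np.array([1.0, 0.0, -1.0, 0.0])
    y = S @ b_true + 0.1 * rng.normal(size=8)
    print(exhaustive_joint_detection(y, S))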