227 results for Wigner-Brillouin perturbation theory
Abstract:
In this paper we prove a formula for the analytic index of a basic Dirac-type operator on a Riemannian foliation, solving a problem that has been open for many years. We also consider more general indices given by twisting the basic Dirac operator by a representation of the orthogonal group. The formula is a sum of integrals over blowups of the strata of the foliation and also involves eta invariants of associated elliptic operators. As a special case, a Gauss-Bonnet formula for the basic Euler characteristic is obtained using two independent proofs.
Abstract:
Descriptive set theory is mainly concerned with studying subsets of the space of all countable binary sequences. In this paper we study the generalization where countable is replaced by uncountable. We explore properties of generalized Baire and Cantor spaces, equivalence relations, and their Borel reducibility. The study shows that descriptive set theory looks very different in this generalized setting compared to the classical, countable case. We also draw the connection between the stability-theoretic complexity of first-order theories and the descriptive-set-theoretic complexity of their isomorphism relations. Our results suggest that Borel reducibility on uncountable structures is a model-theoretically natural way to compare the complexity of isomorphism relations.
Abstract:
We give the first systematic study of strong isomorphism reductions, a notion of reduction more appropriate than polynomial time reduction when, for example, comparing the computational complexity of the isomorphism problem for different classes of structures. We show that the partial ordering of its degrees is quite rich. We analyze its relationship to a further type of reduction between classes of structures based on purely comparing, for every n, the number of nonisomorphic structures of cardinality at most n in both classes. Furthermore, in a more general setting we address the question of the existence of a maximal element in the partial ordering of the degrees.
Abstract:
Vintage capital growth models were at the heart of growth theory in the 1960s. This research line collapsed in the late 1960s with the so-called embodiment controversy and the technical sophistication of the vintage models. This paper analyzes the astonishing revival of this literature in the 1990s. In particular, it outlines three methodological breakthroughs explaining this resurgence: a growth accounting revolution, taking advantage of the availability of new time series; an optimal control revolution, making it possible to safely study vintage capital optimal growth models; and a vintage human capital revolution, along with the rise of economic demography, accounting for the vintage structure of human capital similarly to the age structuring of physical capital. The related literature is surveyed.
Abstract:
After a historical survey of temperament in Johann Sebastian Bach's Well-Tempered Clavier, an analysis of the work was made by applying a number of historical good temperaments as well as some recent proposals. The results obtained show that the global dissonance for all preludes and fugues in major keys can be minimized using the Kirnberger II temperament. The method of analysis used for this research is based on the mathematical theories of sensory dissonance developed by authors such as Hermann Ludwig Ferdinand von Helmholtz, Harry Partch, Reinier Plomp, Willem J. M. Levelt and William A. Sethares.
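A global-dissonance comparison of this kind can be sketched by summing a sensory-dissonance curve over all pairs of sounding partials. The version below uses Sethares' parameterization of the Plomp-Levelt data; the numerical constants are standard assumed values from that parameterization, not figures taken from the paper.

```python
import math

def pair_dissonance(f1, f2, a1, a2):
    """Sensory dissonance of two partials (frequency, amplitude).

    Constants are assumed values from Sethares' fit to the
    Plomp-Levelt roughness data, not taken from the paper."""
    d_star, s1, s2 = 0.24, 0.0207, 18.96
    b1, b2 = 3.5, 5.75
    f_lo, f_hi = min(f1, f2), max(f1, f2)
    s = d_star / (s1 * f_lo + s2)       # critical-bandwidth scaling
    x = s * (f_hi - f_lo)
    return min(a1, a2) * (math.exp(-b1 * x) - math.exp(-b2 * x))

def total_dissonance(partials):
    """Sum pairwise dissonances over all pairs of (freq, amp) partials;
    comparing this total across temperaments ranks their roughness."""
    d = 0.0
    for i in range(len(partials)):
        for j in range(i + 1, len(partials)):
            f1, a1 = partials[i]
            f2, a2 = partials[j]
            d += pair_dissonance(f1, f2, a1, a2)
    return d
```

Evaluating this total for each prelude and fugue under each candidate temperament is what allows the temperaments to be ranked by global dissonance.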
Abstract:
The self-organizing map (Kohonen 1997) is a type of artificial neural network developed to explore patterns in high-dimensional multivariate data. The conventional version of the algorithm uses the Euclidean metric in the adaptation of the model vectors, thus rendering, in theory, the whole methodology incompatible with non-Euclidean geometries. In this contribution we explore the two main aspects of the problem: 1. whether the conventional approach using the Euclidean metric can yield valid results with compositional data; 2. whether a modification of the conventional approach, replacing vectorial sum and scalar multiplication by the canonical operators in the simplex (i.e. perturbation and powering), can converge to an adequate solution. Preliminary tests showed that both methodologies can be used on compositional data. However, the modified version of the algorithm performs worse than the conventional version, in particular when the data are pathological. Moreover, the conventional approach converges faster to a solution when the data are "well-behaved". Key words: Self-Organizing Map; Artificial Neural Networks; Compositional Data
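The simplex operators mentioned in the abstract can be sketched as follows. The adaptation step shown, replacing m + eta*(v - m) by its Aitchison-geometry analogue, is an assumed form of the modification, not the paper's exact formulation.

```python
def closure(x):
    """Rescale positive parts so they sum to 1 (map onto the simplex)."""
    s = sum(x)
    return [xi / s for xi in x]

def perturb(x, y):
    """Perturbation, the simplex analogue of vector addition:
    component-wise product followed by closure."""
    return closure([xi * yi for xi, yi in zip(x, y)])

def power(alpha, x):
    """Powering, the simplex analogue of scalar multiplication:
    component-wise power followed by closure."""
    return closure([xi ** alpha for xi in x])

def som_update_simplex(m, v, eta):
    """One SOM adaptation step using simplex operations: an assumed
    analogue of m + eta*(v - m), where the 'difference' v (-) m is
    the perturbation of v by the component-wise inverse of m."""
    diff = perturb(v, [1.0 / mi for mi in m])
    return perturb(m, power(eta, diff))
```

With eta = 1 the updated model vector lands exactly on the data composition, mirroring the behavior of the Euclidean rule.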
Abstract:
In order to explain the speed of Vesicular Stomatitis Virus (VSV) infections, we develop a simple model that improves previous approaches to the propagation of virus infections. For VSV infections, we find that the delay time elapsed between the adsorption of a viral particle into a cell and the release of its progeny has a very important effect. Moreover, this delay time makes the adsorption rate essentially irrelevant for predicting VSV infection speeds. Numerical simulations are in agreement with the analytical results. Our model satisfactorily explains the experimentally measured speeds of VSV infections.
Abstract:
This paper shows how instructors can use the problem-based learning method to introduce producer theory and market structure in intermediate microeconomics courses. The paper proposes a framework in which different decision problems are presented to students, who are asked to imagine that they are the managers of a firm who need to solve a problem in a particular business setting. In this setting, the instructors' role is to provide both guidance to facilitate student learning and content knowledge on a just-in-time basis.
Abstract:
Some considerations on the application of Fuzzy Set Theory to quantum chemistry are briefly discussed. It is shown here that many chemical concepts associated with the theory are well suited to being connected with the structure of fuzzy sets. It is also explained how some theoretical descriptions of quantum observables are enhanced when treated with the tools associated with these fuzzy sets. The density function is taken as an example of the use of possibility distributions alongside quantum probability distributions.
Abstract:
The biplot has proved to be a powerful descriptive and analytical tool in many areas of application of statistics. For compositional data the necessary theoretical adaptation has been provided, with illustrative applications, by Aitchison (1990) and Aitchison and Greenacre (2002). These papers were restricted to the interpretation of simple compositional data sets. In many situations the problem has to be described in some form of conditional modelling. For example, in a clinical trial where interest is in how patients' steroid metabolite compositions may change as a result of different treatment regimes, interest is in relating the compositions after treatment to the compositions before treatment and the nature of the treatments applied. To study this through a biplot technique requires the development of some form of conditional compositional biplot. This is the purpose of this paper. We choose as a motivating application an analysis of the 1992 US Presidential Election, where interest may be in how the three-part composition, the percentage division of the presidential vote in each state among the three candidates (Bush, Clinton and Perot), depends on the ethnic composition and on the urban-rural composition of the state. The methodology of conditional compositional biplots is first developed and a detailed interpretation of the 1992 US Presidential Election provided. We use a second application, involving the conditional variability of tektite mineral compositions with respect to major oxide compositions, to demonstrate some hazards of simplistic interpretation of biplots. Finally we conjecture on further possible applications of conditional compositional biplots.
Abstract:
In standard multivariate statistical analysis, common hypotheses of interest concern changes in mean vectors and subvectors. In compositional data analysis it is now well established that compositional change is most readily described in terms of the simplicial operation of perturbation, and that subcompositions replace the marginal concept of subvectors. To motivate the statistical developments of this paper we present two challenging compositional problems from food production processes. Against this background the relevance of perturbations and subcompositions can be clearly seen. Moreover, we can identify a number of hypotheses of interest involving the specification of particular perturbations or differences between perturbations, and also hypotheses of subcompositional stability. We identify the two problems as being the counterparts of the analysis of paired comparison or split-plot experiments and of separate-sample comparative experiments, in the jargon of standard multivariate analysis. We then develop appropriate estimation and testing procedures for a complete lattice of relevant compositional hypotheses.
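The two simplicial notions invoked above, perturbation as the description of compositional change and subcompositions as the analogue of subvectors, can be sketched as follows; function names are our own, not the paper's.

```python
def closure(x):
    """Rescale positive parts so they sum to 1 (map onto the simplex)."""
    s = sum(x)
    return [xi / s for xi in x]

def perturbation_between(x_before, x_after):
    """The perturbation p such that perturbing x_before by p yields
    x_after: component-wise ratio, reclosed. This plays the role of
    the difference vector in ordinary multivariate analysis."""
    return closure([a / b for a, b in zip(x_after, x_before)])

def subcomposition(x, parts):
    """Select the given part indices and reclose: the compositional
    counterpart of taking a subvector."""
    return closure([x[i] for i in parts])
```

Hypotheses about compositional change can then be phrased as statements about the perturbation between the before and after compositions, and subcompositional stability as statements about the corresponding subcompositions.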
Abstract:
We present a KAM theory for some dissipative systems (geometrically, these are conformally symplectic systems, i.e. systems that transform a symplectic form into a multiple of itself). For systems with n degrees of freedom depending on n parameters we show that it is possible to find solutions with n-dimensional (Diophantine) frequencies by adjusting the parameters. We do not assume that the system is close to integrable, but we use an a-posteriori format. Our unknowns are a parameterization of the solution and a parameter. We show that if there is a sufficiently approximate solution of the invariance equation, which also satisfies some explicit non-degeneracy conditions, then there is a true solution nearby. We present results both in Sobolev norms and in analytic norms. The a-posteriori format has several consequences: A) smooth dependence on the parameters, including the singular limit of zero dissipation; B) estimates on the measure of parameters covered by quasi-periodic solutions; C) convergence of perturbative expansions in analytic systems; D) bootstrap of regularity (i.e., that all tori which are smooth enough are analytic if the map is analytic); E) a numerically efficient criterion for the breakdown of the quasi-periodic solutions. The proof is based on an iterative, quadratically convergent method and on suitable estimates on the (analytic and Sobolev) norms of the approximate solution. The iterative step takes advantage of some geometric identities, which give a very useful coordinate system in the neighborhood of invariant (or approximately invariant) tori. This system of coordinates has several other uses: A) it shows that for dissipative conformally symplectic systems the quasi-periodic solutions are attractors; B) it leads to efficient algorithms, which have been implemented elsewhere.
Details of the proof are given mainly for maps, but we also explain the slight modifications needed for flows, and we devote the appendix to presenting explicit algorithms for flows.
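As an illustration of the class of systems considered (not the paper's construction), a standard conformally symplectic example is the dissipative standard map, which contracts the symplectic form by a constant factor at each step; for suitable parameters its orbits are attracted to an invariant circle carrying a quasi-periodic motion. The parameter values below are illustrative assumptions.

```python
import math

def dissipative_standard_map(x, y, lam=0.9, eps=0.1, mu=0.1):
    """One step of the dissipative standard map: the angle-action
    symplectic form is contracted by the conformal factor lam, and
    mu is the drift parameter adjusted to select the frequency."""
    y_new = lam * y + mu + eps * math.sin(x)
    x_new = (x + y_new) % (2 * math.pi)
    return x_new, y_new

def orbit_endpoint(x0, y0, n=2000):
    """Iterate the map n times from (x0, y0). Because lam < 1, the
    action stays bounded and orbits settle onto the attractor,
    illustrating consequence A) of the coordinate system above."""
    x, y = x0, y0
    for _ in range(n):
        x, y = dissipative_standard_map(x, y)
    return x, y
```

Dissipation forces the action variable into a bounded region regardless of the initial condition, which is why quasi-periodic solutions of such systems, when they exist, are attractors.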
Abstract:
HEMOLIA (a project under the European Community's 7th Framework Programme) is a new-generation Anti-Money Laundering (AML) intelligent multi-agent alert and investigation system which, in addition to traditional financial data, makes extensive use of modern society's huge telecom data source, thereby opening up a new dimension of capabilities to all money-laundering fighters (FIUs, LEAs) and financial institutes (banks, insurance companies, etc.). This Master's thesis project was done at AIA, one of the partners in the HEMOLIA project in Barcelona. The objective of this thesis is to find the clusters in a network drawn using the financial data. An extensive literature survey has been carried out, and several standard algorithms related to networks have been studied and implemented. The clustering problem is an NP-hard problem, and algorithms such as K-Means and hierarchical clustering have been implemented to study problems in sociology, evolution, anthropology, etc. However, these algorithms have certain drawbacks which make them difficult to apply. The thesis suggests (a) a possible improvement to the K-Means algorithm, (b) a novel approach to the clustering problem using genetic algorithms, and (c) a new algorithm for finding the cluster of a node using a genetic algorithm.
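For reference, the baseline K-Means procedure that the thesis proposes to improve is Lloyd's algorithm: alternate between assigning each point to its nearest centroid and moving each centroid to the mean of its assigned points. This is a minimal self-contained sketch, not the thesis' implementation.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain Lloyd's algorithm on a list of coordinate tuples.
    Returns the final centroids and the clusters of points."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # simple random initialization
    for _ in range(iters):
        # Assignment step: each point goes to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # Update step: move each centroid to the mean of its cluster.
        for c, members in enumerate(clusters):
            if members:  # keep the old centroid if a cluster empties
                dim = len(members[0])
                centroids[c] = tuple(sum(m[d] for m in members) / len(members)
                                     for d in range(dim))
    return centroids, clusters
```

The sensitivity to the random initialization and the fixed choice of k are among the drawbacks that motivate the genetic-algorithm approaches the thesis proposes.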
Abstract:
The dissertation accomplishes two aims: 1) to diagnose what prevents true beliefs from being knowledge; 2) to give a positive account of knowledge. Concerning the first aim, it offers an account of the notion of luck. It defends the view that luck is a form of risk and distinguishes two types of luck. Then it applies the account to the problem of epistemic luck and distinguishes, accordingly, two types of epistemic luck. It is argued that these two types of epistemic luck explain the whole range of cases of not-known true belief. Concerning the second aim, the dissertation advances an account of knowledge in terms of the notion of cognitive control that deals with the two forms of epistemic luck distinguished.