443 results for LORENTZIAN MANIFOLDS
Abstract:
∗ Partially supported by INTAS grant 97-1644
Abstract:
* The research has been partially supported by Bulgarian funding organizations sponsoring the Algebra Section of the Mathematics Institute, Bulgarian Academy of Sciences, by a contract between the Humboldt-Universität and the University of Sofia, and by Grant MM 412/94 from the Bulgarian Board of Education and Technology
Abstract:
AMS Subject Classification (MSC2010): 11F72, 11M36, 58J37
Abstract:
AMS Subj. Classification: 83C15, 83C35
Abstract:
MSC 2010: 30C60
Abstract:
Iva R. Dokuzova, Dimitar R. Razpopov - In the present paper we consider a class V of three-dimensional Riemannian manifolds M with a metric g and two affinor tensors q and S. Another metric ḡ on M is also defined. The local coordinates of all these tensors are circulant matrices. We obtain: 1) a relation between the curvature tensor R induced by g and the curvature tensor R̄ induced by ḡ; 2) an identity for the curvature tensor R in the case when the curvature tensor R̄ vanishes; 3) a relation between the sectional curvature of an arbitrary two-dimensional q-section {x, qx} and the scalar curvature of M.
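For reference, item 3) involves the sectional curvature of the two-plane spanned by x and qx; in the convention R(x,y,z,u) = g(R(x,y)z, u), its standard definition reads (the exact relation to the scalar curvature is the paper's result and is not reproduced here):

```latex
% Standard sectional curvature of the 2-plane {x, qx}, convention R(x,y,z,u) = g(R(x,y)z,u).
K(x, qx) \;=\; \frac{R(x, qx, qx, x)}{g(x,x)\, g(qx,qx) \;-\; g(x,qx)^{2}}
```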
Abstract:
2000 Mathematics Subject Classification: 53C40, 53C25.
Abstract:
2000 Mathematics Subject Classification: 53B05, 53B99.
Abstract:
Experimental and theoretical studies of noise processes in various kinds of AlGaAs/GaAs heterostructures with a quantum well are reported. The measurement setup, involving a Fast Fourier Transform analyzer and an analog wave analyzer covering the frequency range from 10 Hz to 1 MHz, a computerized data storage and processing system, and a cryostat covering the temperature range from 78 K to 300 K, is described in detail. The current noise spectra are obtained with the “three-point method”, using Quan-Tech and avalanche noise sources for calibration. The properties of both GaAs and AlGaAs materials and of field effect transistors based on the two-dimensional electron gas in the interface quantum well are discussed. Extensive measurements are performed on three types of heterostructures, viz., Hall structures with a large spacer layer, modulation-doped non-gated FETs, and more standard gated FETs; all structures are grown by MBE techniques. The Hall structures show Lorentzian generation-recombination noise spectra with nearly temperature-independent relaxation times. This noise is attributed to g-r processes in the 2D electron gas. For the TEGFET structures, we observe several Lorentzian g-r noise components with strongly temperature-dependent relaxation times. This noise is attributed to trapping processes in the doped AlGaAs layer. The trap level energies are determined from an Arrhenius plot of log(τT²) versus 1/T as well as from the plateau values. The theory used to interpret these measurements and to extract the defect level data is reviewed and further developed. Good agreement with the data is found for all reported devices.
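Concretely, the Arrhenius procedure mentioned above reduces to a straight-line fit. The sketch below uses synthetic data only and a function name of my choosing, not the authors' code; it recovers a trap activation energy from the slope of log(τT²) versus 1000/T:

```python
import numpy as np

K_B = 8.617e-5  # Boltzmann constant in eV/K

def trap_energy_from_arrhenius(T, tau):
    """Estimate a trap activation energy (eV) from g-r time constants,
    assuming tau * T**2 ~ exp(E_a / (k_B * T)): a straight-line fit of
    log10(tau * T**2) against 1000/T has slope E_a / (1000 * k_B * ln 10).
    T is in kelvin, tau in seconds."""
    x = 1000.0 / np.asarray(T, dtype=float)
    y = np.log10(np.asarray(tau, dtype=float) * np.asarray(T, dtype=float) ** 2)
    slope, _ = np.polyfit(x, y, 1)
    return slope * 1000.0 * K_B * np.log(10.0)

# Illustrative (synthetic) data only: time constants generated for a 0.30 eV trap.
T = np.array([150.0, 175.0, 200.0, 225.0, 250.0])
tau = 1e-9 * np.exp(0.30 / (K_B * T)) / T**2
print(f"estimated trap depth: {trap_energy_from_arrhenius(T, tau):.2f} eV")
```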
Abstract:
Electronic noise has been investigated in AlxGa1−xN/GaN Modulation-Doped Field Effect Transistors (MODFETs) of submicron dimensions, grown for us by MBE (Molecular Beam Epitaxy) techniques at Virginia Commonwealth University by Dr. H. Morkoç and coworkers. Some 20 devices were grown on a GaN substrate, four of which have leads bonded to source (S), drain (D), and gate (G) pads, respectively. Conduction takes place in the quasi-2D layer of the junction (xy plane), which is perpendicular to the quantum well (z-direction) of average triangular width ∼3 nm. A non-doped intrinsic buffer layer of ∼5 nm separates the Si-doped donors in the AlxGa1−xN layer from the 2D transistor plane, which affords a very high electron mobility, thus enabling high-speed devices. Since all contacts (S, D, and G) must reach through the AlxGa1−xN layer to connect internally to the 2D plane, parallel conduction through this layer is a feature of all modulation-doped devices. While this shunting effect may account for no more than a few percent of the current IDS, it is responsible for most of the excess noise over and above the thermal noise of the device. The excess noise has been analyzed as a sum of Lorentzian spectra and 1/f noise. The Lorentzian noise has been ascribed to trapping of the carriers in the AlxGa1−xN layer. A detailed multitrapping generation-recombination noise theory is presented, which shows that an exponential relationship exists for the time constants obtained from the spectral components as a function of 1/kT. The trap depths have been obtained from Arrhenius plots of log(τT²) vs. 1000/T. Comparison with previous noise results for GaAs devices shows that: (a) many more trapping levels are present in these nitride-based devices; (b) the traps are deeper (farther below the conduction band) than for GaAs. Furthermore, the magnitude of the noise is strongly dependent on the level of depletion of the AlxGa1−xN donor layer, which can be altered by a negative or positive gate bias VGS. Altogether, these frontier nitride-based devices are promising for bluish-light optoelectronic devices and lasers; however, the noise, though well understood, indicates that the purity of the constituent layers should be greatly improved for future technological applications.
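The decomposition described above, a sum of Lorentzian components plus 1/f noise, can be illustrated with a small fitting sketch. The model form follows the abstract, while the data, parameter values, and code are purely hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit

def excess_noise(f, A1, fc1, A2, fc2, B, C):
    """Two Lorentzian g-r components plus 1/f noise and a white floor (arbitrary units)."""
    return (A1 / (1.0 + (f / fc1) ** 2)
            + A2 / (1.0 + (f / fc2) ** 2)
            + B / f
            + C)

# Synthetic spectrum spanning the 10 Hz - 1 MHz band quoted in the abstract.
rng = np.random.default_rng(0)
f = np.logspace(1, 6, 300)
S = excess_noise(f, 100.0, 300.0, 5.0, 2e4, 1e3, 0.1)
S *= 1.0 + 0.02 * rng.standard_normal(f.size)

# Fit the logarithm of the spectrum so every decade carries comparable weight;
# positivity bounds keep the logarithm well defined during the optimization.
log_model = lambda f, *p: np.log10(excess_noise(f, *p))
p0 = (50.0, 100.0, 1.0, 1e4, 500.0, 0.05)
popt, _ = curve_fit(log_model, f, np.log10(S), p0=p0, bounds=(1e-6, np.inf))
print("fitted corner frequencies (Hz):", popt[1], popt[3])
```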
Abstract:
We give a complete description of the four-dimensional Lie algebras g for which Donaldson's tame-versus-compatible question has an affirmative answer for all almost complex structures J on g. As a consequence, we give examples of (non-unimodular) four-dimensional Lie algebras admitting almost complex structures that are tamed by, but not compatible with, symplectic forms. Note that Donaldson asked his question for compact four-manifolds; in that context the problem is still open, but it is believed that any tamed almost complex structure is in fact compatible with some symplectic form. In this presentation I will define the basic objects involved and give some insight into the proof. The key to the proof is translating the problem into a linear-algebra setting. This is joint work with Dr. Draghici.
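For readers outside the area, the standard definitions behind "tamed" and "compatible" (general facts, not specific to this talk) are:

```latex
% J is tamed by a symplectic form \omega if it is positive on all tangent lines:
\omega(v, Jv) > 0 \quad \text{for every } v \neq 0;
% J is compatible with \omega if, in addition, \omega is J-invariant,
\omega(Jv, Jw) = \omega(v, w) \quad \text{for all } v, w,
% in which case g(v, w) := \omega(v, Jw) defines a Riemannian metric.
```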
Abstract:
This book constitutes the refereed proceedings of the 14th International Conference on Parallel Problem Solving from Nature, PPSN 2016, held in Edinburgh, UK, in September 2016. The 93 revised full papers presented were carefully reviewed and selected from 224 submissions. The meeting began with four workshops, which offered an ideal opportunity to explore specific topics: intelligent transportation, landscape-aware heuristic search, natural computing in scheduling and timetabling, and advances in multi-modal optimization. PPSN XIV also included sixteen free tutorials covering new aspects of the field: gray box optimization in theory; theory of evolutionary computation; graph-based and cartesian genetic programming; theory of parallel evolutionary algorithms; promoting diversity in evolutionary optimization: why and how; evolutionary multi-objective optimization; intelligent systems for smart cities; advances on multi-modal optimization; evolutionary computation in cryptography; evolutionary robotics - a practical guide to experiment with real hardware; evolutionary algorithms and hyper-heuristics; a bridge between optimization over manifolds and evolutionary computation; implementing evolutionary algorithms in the cloud; the attainment function approach to performance evaluation in EMO; runtime analysis of evolutionary algorithms: basic introduction; meta-model assisted (evolutionary) optimization. The papers are organized in topical sections on adaptation, self-adaptation and parameter tuning; differential evolution and swarm intelligence; dynamic, uncertain and constrained environments; genetic programming; multi-objective, many-objective and multi-level optimization; parallel algorithms and hardware issues; real-world applications and modeling; theory; diversity and landscape analysis.
Abstract:
For any Legendrian knot in R^3 with the standard contact structure, we show that the existence of an augmentation to any field of the Chekanov-Eliashberg differential graded algebra over Z[t,t^{-1}] is equivalent to the existence of a normal ruling of the front diagram, generalizing results of Fuchs, Ishkhanov, and Sabloff. We also show that any even graded augmentation must send t to -1.
We extend the definition of a normal ruling from J^1(S^1) given by Lavrov and Rutherford to a normal ruling for Legendrian links in #^k(S^1\times S^2). We then show that for Legendrian links in J^1(S^1) and #^k(S^1\times S^2), the existence of an augmentation to any field of the Chekanov-Eliashberg differential graded algebra over Z[t,t^{-1}] is equivalent to the existence of a normal ruling of the front diagram. For Legendrian knots, we also show that any even graded augmentation must send t to -1. We use the correspondence to give nonvanishing results for the symplectic homology of certain Weinstein 4-manifolds.
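For context, the central notion in both abstracts is standard in this literature: given a field F, an augmentation of the Chekanov-Eliashberg DGA (A, ∂) over Z[t, t^{-1}] is a unital algebra map ε: A → F satisfying

```latex
\varepsilon(1) = 1, \qquad \varepsilon \circ \partial = 0,
```

and it is ρ-graded (e.g., even graded for ρ = 2) if ε vanishes on elements of nonzero degree modulo ρ.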
Abstract:
Subspaces and manifolds are two powerful models for high dimensional signals. Subspaces model linear correlation and are a good fit to signals generated by physical systems, such as frontal images of human faces and multiple sources impinging at an antenna array. Manifolds model sources that are not linearly correlated, but where signals are determined by a small number of parameters. Examples are images of human faces under different poses or expressions, and handwritten digits with varying styles. However, there will always be some degree of model mismatch between the subspace or manifold model and the true statistics of the source. This dissertation exploits subspace and manifold models as prior information in various signal processing and machine learning tasks.
A near-low-rank Gaussian mixture model measures proximity to a union of linear or affine subspaces. This simple model can effectively capture the signal distribution when each class is near a subspace. This dissertation studies how the pairwise geometry between these subspaces affects classification performance. When model mismatch is vanishingly small, the probability of misclassification is determined by the product of the sines of the principal angles between subspaces. When the model mismatch is more significant, the probability of misclassification is determined by the sum of the squares of the sines of the principal angles. Reliability of classification is derived in terms of the distribution of signal energy across principal vectors. Larger principal angles lead to smaller classification error, motivating a linear transform that optimizes principal angles. This linear transformation, termed TRAIT, also preserves some specific features in each class, being complementary to a recently developed Low Rank Transform (LRT). Moreover, when the model mismatch is more significant, TRAIT shows superior performance compared to LRT.
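Since the classification results above are phrased in terms of principal angles, a minimal sketch may help: principal angles between two subspaces are the arccosines of the singular values of Q_A^T Q_B for orthonormal bases Q_A, Q_B, and the product of their sines is the quantity the abstract ties to the misclassification probability in the small-mismatch regime. The data and function below are illustrative only, not the dissertation's code:

```python
import numpy as np

def principal_angles(A, B):
    """Principal angles (radians) between the column spans of A and B,
    computed from the singular values of Qa^T Qb for orthonormal bases
    Qa, Qb obtained by QR factorization."""
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    s = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))

# Hypothetical example: two 3-dimensional subspaces of R^20.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 3))
B = rng.standard_normal((20, 3))
theta = principal_angles(A, B)
# Product of sines of the principal angles -- the quantity tied to the
# misclassification probability when the model mismatch is small.
print("product of sines:", float(np.prod(np.sin(theta))))
```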
The manifold model enforces a constraint on the freedom of data variation. Learning features that are robust to data variation is very important, especially when the size of the training set is small. A learning machine with a large number of parameters, e.g., a deep neural network, can describe a very complicated data distribution well. However, it is also more likely to be sensitive to small perturbations of the data and to suffer from degraded performance when generalizing to unseen (test) data.
From the perspective of the complexity of function classes, such a learning machine has a huge capacity (complexity), which tends to overfit. The manifold model provides us with a way of regularizing the learning machine so as to reduce the generalization error and thereby mitigate overfitting. Two different overfitting-prevention approaches are proposed, one from the perspective of data variation, the other from capacity/complexity control. In the first approach, the learning machine is encouraged to make decisions that vary smoothly for data points in local neighborhoods on the manifold. In the second approach, a graph adjacency matrix is derived for the manifold, and the learned features are encouraged to be aligned with the principal components of this adjacency matrix. Experimental results on benchmark datasets demonstrate a clear advantage of the proposed approaches when the training set is small.
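One hedged reading of the second approach (the abstract gives no formulas, so the penalty below is an assumption made purely for illustration): build a k-nearest-neighbor adjacency matrix on the training data and penalize the energy of the learned features that lies outside the span of the top eigenvectors of that matrix.

```python
import numpy as np

def knn_adjacency(X, k=5):
    """Symmetric k-nearest-neighbor adjacency matrix for the rows of X."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    np.fill_diagonal(d2, np.inf)
    W = np.zeros_like(d2)
    idx = np.argsort(d2, axis=1)[:, :k]           # indices of the k nearest neighbors
    rows = np.repeat(np.arange(X.shape[0]), k)
    W[rows, idx.ravel()] = 1.0
    return np.maximum(W, W.T)                     # symmetrize

def alignment_penalty(F, W, m=10):
    """Energy of the features F (n x d) outside the span of the top-m
    eigenvectors of the adjacency matrix W (a hypothetical reading of
    'aligned with the principal components of the adjacency matrix')."""
    _, eigvec = np.linalg.eigh(W)                 # eigenvalues in ascending order
    V = eigvec[:, -m:]                            # leading eigenvectors
    residual = F - V @ (V.T @ F)                  # part of F outside span(V)
    return float(np.sum(residual ** 2))

# Toy usage with random data and random "learned" features (illustration only).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 8))                 # inputs near the manifold
F = rng.standard_normal((100, 16))                # features produced by the model
print("alignment penalty:", alignment_penalty(F, knn_adjacency(X), m=10))
```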
Stochastic optimization makes it possible to track a slowly varying subspace underlying streaming data. By approximating local neighborhoods with affine subspaces, a slowly varying manifold can be efficiently tracked as well, even with corrupted and noisy data. The more local neighborhoods are used, the better the approximation, but the higher the computational complexity. A multiscale approximation scheme is proposed, where the local approximating subspaces are organized in a tree structure. Splitting and merging of the tree nodes then allows efficient control of the number of neighborhoods. The deviation of each datum from the learned model is estimated, yielding a series of statistics for anomaly detection. This framework extends the classical changepoint detection technique, which only works for one-dimensional signals. Simulations and experiments highlight the robustness and efficacy of the proposed approach in detecting an abrupt change in an otherwise slowly varying low-dimensional manifold.
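A drastically simplified, single-subspace version of the deviation statistic can illustrate the idea; the multiscale tree, node splitting/merging, and streaming updates are omitted, and everything here is a hypothetical sketch rather than the dissertation's method:

```python
import numpy as np

def residual_statistic(X_ref, X_new, d=2):
    """Deviation of each new sample from the affine subspace fitted to a
    reference window: squared norm of the projection residual onto the
    d leading principal directions of X_ref."""
    mu = X_ref.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_ref - mu, full_matrices=False)
    V = Vt[:d].T                                   # top-d principal directions
    Y = X_new - mu
    R = Y - Y @ V @ V.T                            # projection residual
    return np.sum(R ** 2, axis=1)

# Toy stream: an abrupt change after the first 20 test samples shows up as a
# jump in the deviation statistic.
rng = np.random.default_rng(2)
basis = rng.standard_normal((10, 2))               # 2-D subspace of R^10
before = rng.standard_normal((100, 2)) @ basis.T + 0.01 * rng.standard_normal((100, 10))
after = rng.standard_normal((50, 10))              # off-subspace samples
stats = residual_statistic(before, np.vstack([before[-20:], after]))
print("mean statistic before vs. after the change:",
      stats[:20].mean(), stats[20:].mean())
```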
Abstract:
© 2016 Springer Science+Business Media Dordrecht. G2-monopoles are solutions to gauge-theoretical equations on G2-manifolds. If the G2-manifolds under consideration are compact, then any irreducible G2-monopole must have singularities. It is then important to understand what kind of singularities G2-monopoles can have. We give examples (in the noncompact case) of non-Abelian monopoles with Dirac-type singularities, and examples of monopoles whose singularities are not of that type. We also give an existence result for Abelian monopoles with Dirac-type singularities on compact manifolds. This should be one of the building blocks in a gluing construction aimed at constructing non-Abelian ones.
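For orientation (conventions vary between authors, so this is one common form rather than necessarily the exact one used in the paper): on a G2-manifold (M, φ) with coassociative 4-form ψ = ∗φ, a monopole is a pair (A, Φ), a connection together with a Higgs field on a bundle over M, satisfying

```latex
% One common form of the G2-monopole equation (sign and Hodge-star conventions differ).
\nabla_A \Phi \;=\; \ast \left( F_A \wedge \psi \right), \qquad \psi = \ast \varphi .
```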