935 results for Pseudo-Riemannian geometry
Abstract:
Mitrofan M. Choban, Petar S. Kenderov, Warren B. Moors - A semitopological group (respectively, topological group) is a group endowed with a topology with respect to which the group multiplication is separately continuous in each variable (respectively, jointly continuous in both variables, with the inverse operation also continuous). In this paper we give conditions, of a topological nature, for a semitopological group to in fact be a topological group. For example, we show that every separable pseudocompact semitopological group is a topological group. We also show that every locally pseudocompact semitopological group whose group operation is jointly continuous is a topological group.
Abstract:
2002 Mathematics Subject Classification: 35S05, 47G30, 58J42.
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2014
Abstract:
2000 Mathematics Subject Classification: 35C15, 35D05, 35D10, 35S10, 35S99.
Abstract:
2000 Mathematics Subject Classification: 53C40, 53B25.
Abstract:
Pavement analysis and design for fatigue cracking involves a number of practical problems, such as material assessment/screening and performance prediction. A mechanics-aided method can answer these questions with satisfactory accuracy in a convenient way when it is appropriately implemented. This paper presents two techniques for implementing the pseudo J-integral based Paris' law to evaluate and predict fatigue cracking in asphalt mixtures and pavements. The first technique, quasi-elastic simulation, provides a rational and appropriate reference modulus for the pseudo analysis (i.e., viscoelastic-to-elastic conversion) by making use of a widely used material property: the dynamic modulus. The physical significance of the quasi-elastic simulation is clarified. This technique facilitates the implementation of fracture mechanics models as well as continuum damage mechanics models to characterize fatigue cracking in asphalt pavements. The second technique, modeling the fracture coefficients of the pseudo J-integral based Paris' law, simplifies the prediction of fatigue cracking without the need to perform fatigue tests. The developed prediction models for the fracture coefficients rely on readily available mixture design properties that directly affect fatigue performance, including the relaxation modulus, air void content, asphalt binder content, and aggregate gradation. Sufficient data are collected to develop such prediction models, and the R² values are around 0.9. The presented case studies serve as examples to illustrate how the pseudo J-integral based Paris' law predicts the fatigue resistance of asphalt mixtures and assesses the fatigue performance of asphalt pavements. Future applications include the estimation of the fatigue life of asphalt mixtures/pavements through a distinct criterion that defines fatigue failure by its physical significance.
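A Paris'-law-type growth equation of the kind referenced above, dc/dN = A(ΔJ_R)^n, lends itself to a short numerical sketch. The coefficients A and n, the load level, and the assumed dependence of the pseudo J-integral on crack length below are illustrative placeholders, not values from the paper.

```python
# Minimal sketch of crack growth under a Paris'-law-type relation
# dc/dN = A * (Delta J_R)^n, integrated with a forward-Euler step.
# All numbers are hypothetical, chosen only to illustrate the mechanics.

def crack_growth(c0, cycles, A, n, delta_j):
    """Integrate dc/dN = A * delta_j(c)**n over the given number of cycles."""
    c = c0
    history = [c]
    for _ in range(cycles):
        c += A * delta_j(c) ** n   # crack-length increment per load cycle
        history.append(c)
    return history

# Assume (hypothetically) the pseudo J-integral grows linearly with crack length.
delta_j = lambda c: 0.5 + 0.1 * c

growth = crack_growth(c0=1.0, cycles=1000, A=1e-4, n=2.0, delta_j=delta_j)
print(f"crack length after 1000 cycles: {growth[-1]:.3f}")
```

In practice the fracture coefficients A and n would come from the prediction models described in the abstract rather than being fixed by hand.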
Abstract:
In SNAP (Surface nanoscale axial photonics) resonators, propagation of a slow whispering-gallery mode along an optical fiber is controlled by nanoscale variation of the effective radius of the fiber [1]. Similar behavior can be realized in so-called nanobump microresonators, in which the introduced variation of the effective radius is asymmetric, i.e., depends on the axial coordinate [2]. This work discusses the possibility of realizing such structures "on the fly" in an optical fiber by applying external electrostatic fields to it. It is shown that local variations in the effective radius of the fiber and in its refractive index caused by external electric fields can be large enough to observe SNAP-structure-like behavior in an originally flat optical fiber. Theoretical estimates of the introduced refractive index and effective radius changes and results of finite element calculations are presented. Various effects are taken into account: electromechanical (piezoelectricity and electrostriction), electro-optical (Pockels and Kerr effects), and the elasto-optical effect. Different initial fiber cross-sections are studied. The use of linear isotropic (such as silica) and nonlinear anisotropic (such as lithium niobate) fiber materials is discussed. REFERENCES [1] M. Sumetsky, J. M. Fini, Opt. Exp. 19, 26470 (2011). [2] L. A. Kochkurov, M. Sumetsky, Opt. Lett. 40, 1430 (2015).
Abstract:
The paper focuses on a specific political system lying between democracy and autocracy, which has similarities to both. Called here a pseudo-democracy, it is examined from the point of view of rents. The paper enquires how a democracy can allow a single party to dominate the political landscape for a long period. The author constructs a model to link rent creation to vote maximization, arguing that it can be rational for incumbents to increase rents beyond the short-term optimum. The model also reveals that surplus rents may offer long-term gains to an elite by strengthening its clientele, challenging the systemic political framework and holding back the opposition. Historical examples show that the end result of a pseudo-democratic system will usually be to weaken economic performance and increase corruption.
Abstract:
The 9/11 Act mandates the inspection of 100% of cargo shipments entering the U.S. by 2012 and 100% inspection of air cargo by March 2010. So far, only 5% of inbound shipping containers are inspected thoroughly while air cargo inspections have fared better at 50%. Government officials have admitted that these milestones cannot be met since the appropriate technology does not exist. This research presents a novel planar solid phase microextraction (PSPME) device with enhanced surface area and capacity for collection of the volatile chemical signatures in air that are emitted from illicit compounds for direct introduction into ion mobility spectrometers (IMS) for detection. These IMS detectors are widely used to detect particles of illicit substances and do not have to be adapted specifically to this technology. For static extractions, PDMS and sol-gel PDMS PSPME devices provide significant increases in sensitivity over conventional fiber SPME. Results show a 50–400 times increase in mass detected of piperonal and a 2–4 times increase for TNT. In a blind study of 6 cases suspected to contain varying amounts of MDMA, PSPME-IMS correctly detected 5 positive cases with no false positives or negatives. One of these cases had minimal amounts of MDMA resulting in a false negative response for fiber SPME-IMS. A La (dihed) phase chemistry has shown an increase in the extraction efficiency of TNT and 2,4-DNT and enhanced retention over time. An alternative PSPME device was also developed for the rapid (seconds) dynamic sampling and preconcentration of large volumes of air for direct thermal desorption into an IMS. This device affords high extraction efficiencies due to strong retention properties under ambient conditions resulting in ppt detection limits when 3.5 L of air are sampled over the course of 10 seconds. 
Dynamic PSPME was used to sample the headspace over the following: MDMA tablets (12–40 ng of piperonal detected), high explosives (Pentolite) (0.6 ng of TNT detected), and several smokeless powders (26–35 ng of 2,4-DNT and 11–74 ng of DPA detected). PSPME-IMS technology is flexible to end-user needs, low-cost, rapid, sensitive, easy to use, easy to implement, and effective.
Abstract:
Structural Health Monitoring (SHM) systems were developed to evaluate the integrity of a system during operation and to quickly identify maintenance problems. They will be used in future aerospace vehicles to improve safety, reduce cost, and minimize the maintenance time of a system. Many SHM systems have already been developed to evaluate the integrity of plates and are used in marine structures; their implementation in manufacturing processes is still awaited. The application of SHM methods to complex geometries and to welds are two important challenges in this area of research. This research work started by studying the characteristics of piezoelectric actuators, and a small energy harvester was designed. The output voltages at different frequencies of vibration were acquired to determine the nonlinear characteristics of the piezoelectric stripe actuators. The frequency response was evaluated experimentally. AA-battery-sized energy harvesting devices were developed using these actuators. When the round and square cross-section devices were excited at a frequency of 50 Hz, they generated 16 V and 25 V, respectively. The Surface Response to Excitation (SuRE) and Lamb wave methods were used to estimate the condition of parts with complex geometries; cutting tools and welded plates were considered. Both approaches used piezoelectric elements attached to the surfaces of the considered parts. When the SuRE method was used, the variation of the magnitude of the frequency response was evaluated and the sum of the squares of the differences was calculated. The envelope of the received signal was used for the analysis of wave propagation. Bi-orthogonal wavelet (Binlet) analysis was also used to evaluate the data obtained with the Lamb wave technique. Both the Lamb wave and SuRE approaches, along with the three methods for data analysis, worked effectively to detect increasing tool wear. Similarly, they detected defects on the plate, on the weld, and on a separate plate without any sensor, as long as it was welded to the test plate.
Abstract:
The presence of heavy metals, organic contaminants and natural toxins in natural water bodies poses a serious threat to the environment and the health of living organisms. Therefore, there is a critical need to identify sustainable and environmentally friendly water treatment processes. In this dissertation, I focus on fundamental studies of advanced oxidation processes and magnetic nanomaterials as promising new technologies for water treatment. Advanced oxidation processes employ reactive oxygen species (ROS), which can lead to the mineralization of a number of pollutants and toxins. The rates of formation, steady-state concentrations, and kinetic parameters of hydroxyl radical and singlet oxygen produced by various TiO2 photocatalysts under UV or visible irradiation were measured using selective chemical probes. Hydroxyl radical is the dominant ROS, and its generation depends on the experimental conditions. The optimal conditions for the generation of hydroxyl radical by TiO2-coated glass microspheres are studied by response surface methodology and applied to the degradation of dimethyl phthalate. Singlet oxygen (1O2) also plays an important role in advanced oxidation processes, so the degradation of microcystin-LR by rose bengal, a 1O2 sensitizer, was studied. The measured bimolecular reaction rate constant between MC-LR and 1O2 is ∼10^6 M^-1 s^-1, based on competition kinetics with furfuryl alcohol. A typical adsorbent needs to be separated after treatment, whereas magnetic iron oxides can easily be removed by a magnetic field. Maghemite and humic acid coated magnetite (HA-Fe3O4) were synthesized, characterized, and applied for chromium(VI) removal. The adsorption of chromium(VI) by maghemite and HA-Fe3O4 follows a pseudo-second-order kinetic process. The adsorption of chromium(VI) by maghemite is accurately modeled using adsorption isotherms, and solution pH and the presence of humic acid influence adsorption. Humic acid coated magnetite can adsorb and reduce chromium(VI) to non-toxic chromium(III), and the reaction is not highly dependent on solution pH. The functional groups associated with humic acid act as ligands, leading to Cr(III) complexes via a coupled reduction-complexation mechanism. Extended X-ray absorption fine structure spectroscopy demonstrates that the Cr(III) in the Cr-loaded HA-Fe3O4 materials has six neighboring oxygen atoms in an octahedral geometry, with average bond lengths of 1.98 Å.
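The pseudo-second-order kinetic model mentioned above, q_t = k2·qe²·t / (1 + k2·qe·t), is commonly fitted through its linearized form t/q_t = 1/(k2·qe²) + t/qe. The sketch below illustrates that fit on synthetic data; the values of qe and k2 are invented for the example, not taken from the dissertation.

```python
# Hedged sketch: fitting the pseudo-second-order adsorption kinetic model
# via its standard linearization t/q_t = 1/(k2*qe**2) + t/qe.
import numpy as np

def fit_pseudo_second_order(t, qt):
    """Return (qe, k2) from a linear fit of t/qt against t."""
    slope, intercept = np.polyfit(t, t / qt, 1)
    qe = 1.0 / slope                 # slope = 1/qe
    k2 = slope ** 2 / intercept      # intercept = 1/(k2*qe**2)
    return qe, k2

# Synthetic data generated from qe = 2.0, k2 = 0.5 (hypothetical units).
t = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 40.0])
qe_true, k2_true = 2.0, 0.5
qt = (k2_true * qe_true**2 * t) / (1 + k2_true * qe_true * t)

qe_fit, k2_fit = fit_pseudo_second_order(t, qt)
print(f"qe ≈ {qe_fit:.2f}, k2 ≈ {k2_fit:.2f}")
```

With real uptake data, a good linear fit of t/q_t against t is the usual evidence that adsorption follows pseudo-second-order kinetics.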
Abstract:
The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted at the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4 000 000 data points) from 539 papers had been archived. Here we present the developments of this data compilation in the five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from the Southern Hemisphere and polar oceans remains relatively low. Data from 60 studies that investigated the response of a mix of organisms or natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance of considerably more data archived on calcification and primary production than on other processes has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, to help develop standard vocabularies describing the variables, and to define best practices for archiving ocean acidification data.
Abstract:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
Subspaces and manifolds are two powerful models for high dimensional signals. Subspaces model linear correlation and are a good fit to signals generated by physical systems, such as frontal images of human faces and multiple sources impinging at an antenna array. Manifolds model sources that are not linearly correlated, but where signals are determined by a small number of parameters. Examples are images of human faces under different poses or expressions, and handwritten digits with varying styles. However, there will always be some degree of model mismatch between the subspace or manifold model and the true statistics of the source. This dissertation exploits subspace and manifold models as prior information in various signal processing and machine learning tasks.
A near-low-rank Gaussian mixture model measures proximity to a union of linear or affine subspaces. This simple model can effectively capture the signal distribution when each class is near a subspace. This dissertation studies how the pairwise geometry between these subspaces affects classification performance. When model mismatch is vanishingly small, the probability of misclassification is determined by the product of the sines of the principal angles between subspaces. When the model mismatch is more significant, the probability of misclassification is determined by the sum of the squares of the sines of the principal angles. Reliability of classification is derived in terms of the distribution of signal energy across principal vectors. Larger principal angles lead to smaller classification error, motivating a linear transform that optimizes principal angles. This linear transformation, termed TRAIT, also preserves some specific features in each class, being complementary to a recently developed Low Rank Transform (LRT). Moreover, when the model mismatch is more significant, TRAIT shows superior performance compared to LRT.
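The quantity driving the analysis above, the principal angles between a pair of subspaces and the product of their sines, can be computed directly from basis matrices. The sketch below uses the standard SVD-based construction; the example subspaces are invented for illustration.

```python
# Sketch: principal angles between two subspaces and the product of their sines,
# computed via QR orthonormalization followed by an SVD (singular values are
# the cosines of the principal angles).
import numpy as np

def principal_angles(A, B):
    """Principal angles (radians) between the column spans of A and B."""
    Qa, _ = np.linalg.qr(A)
    Qb, _ = np.linalg.qr(B)
    cosines = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
    return np.arccos(np.clip(cosines, -1.0, 1.0))

# Two 2-D subspaces of R^3 that share one direction (hypothetical example).
A = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])   # span{e1, e2}
B = np.array([[1.0, 0.0], [0.0, 0.0], [0.0, 1.0]])   # span{e1, e3}
angles = principal_angles(A, B)
print(np.prod(np.sin(angles)))   # shared direction -> sin(0) = 0, product is 0
```

The vanishing product for subspaces sharing a direction mirrors the abstract's point: small principal angles (overlapping classes) drive the misclassification behavior, while larger angles improve separability.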
The manifold model enforces a constraint on the freedom of data variation. Learning features that are robust to data variation is very important, especially when the size of the training set is small. A learning machine with a large number of parameters, e.g., a deep neural network, can describe a very complicated data distribution well. However, it is also more likely to be sensitive to small perturbations of the data, and to suffer from degraded performance when generalizing to unseen (test) data.
From the perspective of the complexity of function classes, such a learning machine has a huge capacity (complexity), which tends to overfit. The manifold model provides us with a way of regularizing the learning machine so as to reduce the generalization error and thereby mitigate overfitting. Two different approaches to preventing overfitting are proposed, one from the perspective of data variation, the other from capacity/complexity control. In the first approach, the learning machine is encouraged to make decisions that vary smoothly for data points in local neighborhoods on the manifold. In the second approach, a graph adjacency matrix is derived for the manifold, and the learned features are encouraged to be aligned with the principal components of this adjacency matrix. Experimental results on benchmark datasets are presented, showing a clear advantage of the proposed approaches when the training set is small.
Stochastic optimization makes it possible to track a slowly varying subspace underlying streaming data. By approximating local neighborhoods using affine subspaces, a slowly varying manifold can be efficiently tracked as well, even with corrupted and noisy data. The more local neighborhoods are used, the better the approximation, but the higher the computational complexity. A multiscale approximation scheme is proposed, where the local approximating subspaces are organized in a tree structure. Splitting and merging of the tree nodes then allows efficient control of the number of neighborhoods. The deviation of each datum from the learned model is estimated, yielding a series of statistics for anomaly detection. This framework extends the classical changepoint detection technique, which only works for one-dimensional signals. Simulations and experiments highlight the robustness and efficacy of the proposed approach in detecting an abrupt change in an otherwise slowly varying low-dimensional manifold.
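The deviation statistic described above boils down to measuring how far each datum lies from the learned low-dimensional model. A minimal sketch, with a fixed (rather than tracked) subspace and synthetic data invented for the example:

```python
# Sketch: residual of each datum after projection onto a low-dimensional
# subspace, used as an anomaly score. The subspace and data are synthetic;
# in the thesis the model is tracked online and organized in a multiscale tree.
import numpy as np

def residual_norms(X, U):
    """Distance from each row of X to the column span of U (orthonormal columns)."""
    projections = X @ U @ U.T
    return np.linalg.norm(X - projections, axis=1)

rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.normal(size=(10, 2)))     # a 2-D subspace of R^10
inliers = rng.normal(size=(50, 2)) @ U.T          # data lying on the subspace
outlier = rng.normal(size=(1, 10))                # an abrupt off-model point
scores = residual_norms(np.vstack([inliers, outlier]), U)
print(scores[:-1].max() < scores[-1])             # the off-model point scores highest
```

Thresholding such a score sequence over time is what turns the tracked model into a changepoint/anomaly detector for high-dimensional streams.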
Abstract:
This work explores the use of statistical methods in describing and estimating camera poses, as well as the information feedback loop between camera pose and object detection. Surging development in robotics and computer vision has pushed the need for algorithms that infer, understand, and utilize information about the position and orientation of the sensor platforms when observing and/or interacting with their environment.
The first contribution of this thesis is the development of a set of statistical tools for representing and estimating the uncertainty in object poses. A distribution for representing the joint uncertainty over multiple object positions and orientations is described, called the mirrored normal-Bingham distribution. This distribution generalizes both the normal distribution in Euclidean space, and the Bingham distribution on the unit hypersphere. It is shown to inherit many of the convenient properties of these special cases: it is the maximum-entropy distribution with fixed second moment, and there is a generalized Laplace approximation whose result is the mirrored normal-Bingham distribution. This distribution and approximation method are demonstrated by deriving the analytical approximation to the wrapped-normal distribution. Further, it is shown how these tools can be used to represent the uncertainty in the result of a bundle adjustment problem.
Another application of these methods is illustrated as part of a novel camera pose estimation algorithm based on object detections. The autocalibration task is formulated as a bundle adjustment problem using prior distributions over the 3D points to enforce the objects' structure and their relationship with the scene geometry. This framework is very flexible and enables the use of off-the-shelf computational tools to solve specialized autocalibration problems. Its performance is evaluated using a pedestrian detector to provide head and foot location observations, and it proves much faster and potentially more accurate than existing methods.
Finally, the information feedback loop between object detection and camera pose estimation is closed by utilizing camera pose information to improve object detection in scenarios with significant perspective warping. Methods are presented that allow the inverse perspective mapping traditionally applied to images to be applied instead to features computed from those images. For the special case of HOG-like features, which are used by many modern object detection systems, these methods are shown to provide substantial performance benefits over unadapted detectors while achieving real-time frame rates, orders of magnitude faster than comparable image warping methods.
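At the core of the inverse perspective mapping discussed above is a planar homography: a 3x3 matrix that maps image coordinates to ground-plane coordinates through homogeneous coordinates. The sketch below shows only that generic mapping step; the matrix H is a hypothetical example, not calibration data from this work, and the thesis's contribution of warping features rather than pixels is not reproduced here.

```python
# Sketch of the planar homography underlying inverse perspective mapping:
# pixel coordinates are lifted to homogeneous coordinates, multiplied by H,
# and normalized by the projective scale.
import numpy as np

def apply_homography(H, pts):
    """Map Nx2 point coordinates through the 3x3 homography H."""
    ones = np.ones((pts.shape[0], 1))
    homog = np.hstack([pts, ones]) @ H.T
    return homog[:, :2] / homog[:, 2:3]   # divide out the projective scale

H = np.array([[1.0, 0.2, 5.0],
              [0.0, 1.5, 2.0],
              [0.0, 0.001, 1.0]])         # hypothetical mapping with a mild perspective term
pts = np.array([[100.0, 200.0], [320.0, 240.0]])
ground = apply_homography(H, pts)
```

Applying this mapping to dense feature maps instead of raw pixels is what lets a detector see an approximately perspective-free view without re-extracting features.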
The statistical tools and algorithms presented here are especially promising for mobile cameras, providing the ability to autocalibrate and adapt to the camera pose in real time. In addition, these methods have wide-ranging potential applications in diverse areas of computer vision, robotics, and imaging.