965 results for Pseudo-Pelger-Huët Anomaly
Abstract:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
I discuss geometry and normal forms for pseudo-Riemannian metrics with parallel spinor fields in some interesting dimensions. I also discuss the interaction of these conditions for parallel spinor fields with the condition that the Ricci tensor vanish (which, for pseudo-Riemannian manifolds, is not an automatic consequence of the existence of a nontrivial parallel spinor field).
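For context on the parenthetical remark, here is a minimal sketch of the standard contracted-curvature argument (a textbook fact, not specific to this work), in LaTeX notation:

\nabla \epsilon = 0 \;\Longrightarrow\; R(X,Y)\cdot\epsilon = 0 \;\Longrightarrow\; \operatorname{Ric}(X)\cdot\epsilon = 0 \quad \text{for all vector fields } X, Y.

In Riemannian signature, Clifford multiplication by a nonzero vector v is invertible (since v \cdot v \cdot \epsilon = -|v|^2\,\epsilon with |v|^2 \neq 0), so \operatorname{Ric} = 0 follows; in indefinite signature \operatorname{Ric}(X) may be a null vector, and a null vector can annihilate a spinor, so Ricci-flatness is not automatic.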
Abstract:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
"Para describir la vida política de las democracias antiguas nos vemos forzados a servirnos de términos que utilizamos para las democracias modernas. Pero hay que tener cuidado de no engañarse. Pues las mismas palabras contemplan a menudo las realidades más diversas" (Reverdin, 1945: 201). Esta afirmación nos autoriza a preguntarnos si hay o no una conexión entre los demagogou atenienses y lo que nosotros entendemos por "demagogos". El lexema ingresó a los idiomas modernos por una traducción al francés de una traducción latina de la Política de Aristóteles en el siglo XIV d.C. (Robert, 1954: s.v.), donde ya era patente el carácter peyorativo del lexema. Durante los últimos cincuenta años, los especialistas han analizado las ocurrencias del campo léxico de la demagogia desde una perspectiva histórico- sociológica y en ese sentido coinciden en que estos vocablos en su origen eran "neutros" (Connor, 1992; Lane, 2012). Ostwald, por su parte, opina que la connotación peyorativa del término probablemente se deba a Aristóteles, específicamente a un pasaje de la Política (4.4.1292a4-38) (1986: 201). Sin embargo, contamos con testimonios del lexema de fines del siglo V a.C. y principios del IV a.C. donde es posible entender cierta connotación negativa. Dentro del conjunto de testimonios se encuentra el Contra Alcibíades del Pseudo Andócides. El presente trabajo, complementario de análisis anteriores en el corpus del siglo V a.C., tiene como objetivo analizar el mencionado lexema en su cotexto a la luz de una perspectiva que concibe las palabras como piezas lingüísticas susceptibles de variación semántica incluso en un mismo estadio sincrónico, aplicando conceptos de las teorías de la Nueva Retórica y del Análisis del Discurso
Abstract:
Gravity surveying is challenging in Antarctica because of its hostile environment and inaccessibility. Nevertheless, many ground-based, airborne and shipborne gravity campaigns have been completed by the geophysical and geodetic communities since the 1980s. We present the first modern Antarctic-wide gravity data compilation derived from 13 million data points covering an area of 10 million km², which corresponds to 73% coverage of the continent. The remove-compute-restore technique was applied for gridding, which facilitated levelling of the different gravity datasets with respect to an Earth Gravity Model derived from satellite data alone. The resulting free-air and Bouguer gravity anomaly grids of 10 km resolution are publicly available. These grids will enable new high-resolution combined Earth Gravity Models to be derived and represent a major step forward towards solving the geodetic polar data gap problem. They provide a new tool to investigate continental-scale lithospheric structure and geological evolution of Antarctica.
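A minimal sketch of the remove-compute-restore idea, assuming point anomalies and a satellite-only global model evaluated at both the observation points and the target grid; the function and the choice of scipy's griddata are illustrative, not this compilation's actual pipeline.

from scipy.interpolate import griddata

def remove_compute_restore(points, anomalies, model_at_points, grid_xy, model_at_grid):
    """points: (N, 2) coordinates; anomalies: (N,) observed gravity anomalies;
    grid_xy: (M, 2) target grid nodes. Model values are assumed to come from a
    satellite-only Earth Gravity Model."""
    residuals = anomalies - model_at_points          # "remove" the long wavelengths
    grid_res = griddata(points, residuals, grid_xy)  # "compute": grid the smooth residuals
    return grid_res + model_at_grid                  # "restore" the global model on the grid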
Abstract:
The deep sea sedimentary record is an archive of the pre-glacial to glacial development of Antarctica and of changes in climate, tectonics and ocean circulation. Identification of the pre-glacial, transitional and full glacial components in the sedimentary record is necessary for ice sheet reconstruction and to build circum-Antarctic sediment thickness grids for past topography and bathymetry reconstructions, which constrain paleoclimate models. A ~3300 km long Weddell Sea to Scotia Sea transect of multichannel seismic reflection data from various organisations was used to interpret new horizons, define an initial basin-wide seismostratigraphy, and identify the pre-glacial to glacial components. We mapped seven main units, of which three are in the inferred Cretaceous-Paleocene pre-glacial regime, one in the Eocene-Oligocene transitional regime and three in the Miocene-Pleistocene full glacial climate regime. Sparse borehole data from ODP Leg 113 and SHALDRIL constrain the ages of the upper three units. Compiled seafloor-spreading magnetic anomalies constrain the basement ages and the hypothetical age model. In many cases, the new horizons and stratigraphy contradict the interpretations of local studies. Each seismic sedimentary unit and its associated base horizon are continuous and traceable for the entire transect length, but reflect a lateral change in age whilst representing the same depositional process. The up to 1240 m thick pre-glacial seismic units form a mound in the central Weddell Sea basin and, in conjunction with the eroded flank geometry, support the interpretation of a Cretaceous proto-Weddell Gyre. The base reflector of the transitional seismic unit, which marks the initial ice sheet advances to the outer shelf, has a lateral model age of 26.6-15.5 Ma from southeast to northwest. The Pliocene-Pleistocene glacial deposits reveal lower sedimentation rates, indicating reduced sediment supply. Sedimentation rates for the pre-glacial, transitional and full glacial components are highest around the Antarctic Peninsula, indicating higher erosion and sediment supply from a younger basement. We interpret an Eocene East Antarctic Ice Sheet expansion, Oligocene grounding of the West Antarctic Ice Sheet, and Early Miocene grounding of the Antarctic Peninsula Ice Sheet.
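As a purely illustrative aside, the sedimentation rates compared above reduce to unit thickness divided by the time between the dated bounding horizons; the numbers below are placeholders, not values from this study.

def sedimentation_rate(thickness_m, base_age_ma, top_age_ma):
    """Mean sedimentation rate in m/Myr for a unit between two dated horizons."""
    return thickness_m / (base_age_ma - top_age_ma)

# hypothetical unit: 1240 m deposited between model ages of 66 Ma and 34 Ma
print(sedimentation_rate(1240.0, 66.0, 34.0))  # 38.75 m/Myr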
Abstract:
Following my recent edition of the pseudo-Aristotelian Pepli Epitaphia, this paper focuses on the apocrypha to those epitaphs composed by John Tzetzes in the twelfth century: a set of eight elegiac couplets for the heroes he deemed worthy of such a task, and for whom he could find no epitaph preserved in the manuscript sources available to him. To that end, the paper also investigates the degree of knowledge and the transmission of this epigrammatic corpus in Byzantine literature, and considers the readings and the very significance of two codices held in the Biblioteca Nacional de España (M and Md), in which Constantine Lascaris copied, directly from Tzetzes, two short anthologies of these compositions.
Abstract:
FPGAs and GPUs are often used when real-time performance in video processing is required. An accelerated processor is chosen based on task-specific priorities (power consumption, processing time and detection accuracy), and this decision is normally made once at design time. All three characteristics are important, particularly in battery-powered systems. Here we propose a method for moving the selection of processing platform from a single design-time choice to a continuous run-time one. We implement Histogram of Oriented Gradients (HOG) detectors for cars and people and Mixture of Gaussians (MoG) motion detectors running across FPGA, GPU and CPU in a heterogeneous system, and use this to detect illegally parked vehicles in urban scenes. Power, time and accuracy information for each detector is characterised. An anomaly measure is assigned to each detected object based on its trajectory and location compared with learned contextual movement patterns. This drives processor and implementation selection, so that scenes with high behavioural anomalies are processed with faster but more power-hungry implementations, while routine or static time periods are processed with power-optimised, less accurate, slower versions. Real-time performance is evaluated on video datasets including i-LIDS. Compared to power-optimised static selection, automatic dynamic implementation mapping is 10% more accurate but draws 12 W of extra power in our testbed desktop system.
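A minimal sketch of the run-time mapping policy described above, assuming each implementation has been characterised off-line for power, latency and accuracy; the class, the numbers, and the 0.5 threshold are illustrative assumptions, not the paper's measured values.

from dataclasses import dataclass

@dataclass
class Implementation:
    name: str
    power_w: float     # characterised power draw while running
    latency_ms: float  # characterised per-frame processing time
    accuracy: float    # characterised detection accuracy

IMPLS = [
    Implementation("HOG_CPU", 15.0, 90.0, 0.78),   # hypothetical numbers
    Implementation("HOG_FPGA", 18.0, 25.0, 0.84),
    Implementation("HOG_GPU", 27.0, 12.0, 0.88),
]

def select_implementation(scene_anomaly: float) -> Implementation:
    """High behavioural anomaly -> fastest, most accurate implementation;
    quiet or static periods -> lowest-power implementation."""
    if scene_anomaly > 0.5:
        return max(IMPLS, key=lambda impl: impl.accuracy)
    return min(IMPLS, key=lambda impl: impl.power_w)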
Abstract:
This work addresses the problem of detecting human behavioural anomalies in crowded surveillance environments. We focus in particular on detecting subtle anomalies in a behaviourally heterogeneous surveillance scene. To reach this goal we implement a novel unsupervised context-aware process, proposing and evaluating a method of using social context and scene context to improve behaviour analysis. We find that in a crowded scene the application of Mutual Information-based social context makes it possible to prevent self-justifying groups and to propagate anomalies through a social network, granting greater anomaly detection capability. Scene context uniformly improves the detection of anomalies in both datasets. The strength of our contextual features is demonstrated by the detection of subtly abnormal behaviours that would otherwise remain indistinguishable from normal behaviour.
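A minimal sketch of mutual-information social context, assuming tracked agents are represented by equal-length sequences of discrete scene-cell labels; agents whose trajectories share high mutual information are linked, and part of the stronger anomaly score is propagated to linked partners so a coherent group cannot mutually justify abnormal behaviour. The 0.2 link threshold and 0.5 propagation weight are illustrative assumptions.

import numpy as np
from sklearn.metrics import mutual_info_score

def propagate_anomalies(trajectories, scores, mi_link=0.2, weight=0.5):
    """trajectories: list of equal-length sequences of discrete cell labels;
    scores: per-agent anomaly scores in [0, 1]."""
    out = np.asarray(scores, dtype=float).copy()
    n = len(trajectories)
    for i in range(n):
        for j in range(i + 1, n):
            # socially linked agents = trajectories sharing high mutual information
            if mutual_info_score(trajectories[i], trajectories[j]) > mi_link:
                peak = max(scores[i], scores[j])
                out[i] = max(out[i], weight * peak)  # propagate the stronger anomaly
                out[j] = max(out[j], weight * peak)
    return out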
Abstract:
OBJECTIVES: The aim of this study was to describe the epidemiology of Ebstein's anomaly in Europe and its association with maternal health and medication exposure during pregnancy.
DESIGN: We carried out a descriptive epidemiological analysis of population-based data.
SETTING: We included data from 15 European Surveillance of Congenital Anomalies congenital anomaly registries in 12 European countries, covering a population of 5.6 million births during 1982-2011.
PARTICIPANTS: Cases included live births, fetal deaths from 20 weeks' gestation, and terminations of pregnancy for fetal anomaly.
MAIN OUTCOME MEASURES: We estimated total prevalence per 10,000 births. Odds ratios for exposure to maternal illnesses/medications in the first trimester of pregnancy were calculated by comparing Ebstein's anomaly cases with cardiac and non-cardiac malformed controls, excluding cases with genetic syndromes and adjusting for time period and country.
RESULTS: In total, 264 Ebstein's anomaly cases were recorded; 81% were live births, 2% of which were diagnosed after the first year of life; 54% of cases with Ebstein's anomaly or a co-existing congenital anomaly were prenatally diagnosed. Total prevalence rose over time from 0.29 (95% confidence interval (CI) 0.20-0.41) to 0.48 (95% CI 0.40-0.57) (p<0.01). In all, nine cases were exposed to maternal mental health conditions/medications (adjusted odds ratio (adjOR) 2.64, 95% CI 1.33-5.21) compared with cardiac controls. Cases were more likely to be exposed to maternal β-thalassemia (adjOR 10.5, 95% CI 3.13-35.3, n=3) and haemorrhage in early pregnancy (adjOR 1.77, 95% CI 0.93-3.38, n=11) compared with cardiac controls.
CONCLUSIONS: The increasing prevalence of Ebstein's anomaly may be related to better and earlier diagnosis. Our data suggest that Ebstein's anomaly is associated with maternal mental health problems generally rather than lithium or benzodiazepines specifically; therefore, changing or stopping medications may not be preventative. We found new associations requiring confirmation.
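A hedged sketch of how an adjusted odds ratio of the kind reported above can be computed: logistic regression of case status on a first-trimester exposure flag, adjusting for time period and country. The data frame, column names and simulated values are hypothetical; only the modelling pattern is the point.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "is_case": rng.binomial(1, 0.15, n),   # 1 = Ebstein's anomaly, 0 = malformed control
    "exposed": rng.binomial(1, 0.10, n),   # 1 = exposed in the first trimester
    "period":  rng.choice(["1982-1996", "1997-2011"], n),
    "country": rng.choice(["A", "B", "C"], n),
})
fit = smf.logit("is_case ~ exposed + C(period) + C(country)", data=df).fit(disp=0)
lo, hi = np.exp(fit.conf_int().loc["exposed"])
print(f"adjOR = {np.exp(fit.params['exposed']):.2f} (95% CI {lo:.2f}-{hi:.2f})")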
Abstract:
To maintain the pace of development set by Moore's law, production processes in semiconductor manufacturing are becoming more and more complex. The development of efficient and interpretable anomaly detection systems is fundamental to keeping production costs low. As the dimension of process monitoring data can become extremely high, anomaly detection systems suffer from the curse of dimensionality, so dimensionality reduction plays an important role. Classical dimensionality reduction approaches, such as Principal Component Analysis, generally involve transformations that seek to maximize the explained variance. In datasets with several clusters of correlated variables, the contributions of isolated variables to the explained variance may be insignificant, with the result that they may not be included in the reduced data representation. It is then not possible to detect an anomaly that is reflected only in such isolated variables. In this paper we present a new dimensionality reduction technique that takes account of such isolated variables and demonstrate how it can be used to build an interpretable and robust anomaly detection system for Optical Emission Spectroscopy data.
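A small synthetic illustration of the limitation described above, under assumed dimensions: eight variables driven by one common factor dominate the explained variance, so a variance-maximising projection (here scikit-learn's PCA with one retained component) assigns almost no weight to a low-variance isolated variable, and an anomaly expressed only there is nearly invisible in the reduced representation.

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n = 1000
common = rng.normal(size=(n, 1))                  # shared factor
cluster = common + 0.1 * rng.normal(size=(n, 8))  # cluster of 8 correlated variables
isolated = 0.1 * rng.normal(size=(n, 1))          # isolated, low-variance variable
X = np.hstack([cluster, isolated])

pca = PCA(n_components=1).fit(X)  # keeps only the high-variance cluster direction

x_anom = np.zeros((1, 9))
x_anom[0, 8] = 1.0                # anomaly expressed only in the isolated variable
print(pca.transform(x_anom))      # ~0: the anomaly vanishes in the reduced space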
Abstract:
Finding rare events in multidimensional data is an important detection problem with applications in many fields, such as risk estimation in the insurance industry, finance, flood prediction, medical diagnosis, quality assurance, security, and safety in transportation. The occurrence of such anomalies is so infrequent that there is usually not enough training data to learn an accurate statistical model of the anomaly class. In some cases, such events may never have been observed, so the only available information is a set of normal samples and an assumed pairwise similarity function. Such a metric may be known only up to a certain number of unspecified parameters, which would either need to be learned from training data or fixed by a domain expert. Sometimes the anomalous condition can be formulated algebraically, such as a measure exceeding a predefined threshold, but nuisance variables may complicate the estimation of that measure. Change detection methods used in time series analysis are not easily extendable to the multidimensional case, where discontinuities are not localized to a single point. On the other hand, in higher dimensions data exhibit more complex interdependencies, and there is redundancy that can be exploited to adaptively model the normal data. In the first part of this dissertation, we review the theoretical framework for anomaly detection in images and previous anomaly detection work done in the context of crack detection and detection of anomalous components in railway tracks. In the second part, we propose new anomaly detection algorithms. The fact that curvilinear discontinuities in images are sparse with respect to the frame of shearlets allows us to pose this anomaly detection problem as basis pursuit optimization. We therefore pose the problem of detecting curvilinear anomalies in noisy textured images as a blind source separation problem under sparsity constraints and propose an iterative shrinkage algorithm to solve it. Taking advantage of the parallel nature of this algorithm, we describe how it can be accelerated using graphics processing units (GPUs). We then propose a new method for finding defective components on railway tracks using cameras mounted on a train, describing how to extract features and use a combination of classifiers to solve this problem. Next, we scale anomaly detection to bigger datasets with complex interdependencies, showing that the anomaly detection problem fits naturally in the multitask learning framework: the first task consists of learning a compact representation of the good samples, while the second consists of learning the anomaly detector. Using deep convolutional neural networks, we show that it is possible to train a deep model with a limited number of anomalous examples. In sequential detection problems, the presence of time-variant nuisance parameters affects the detection performance. In the last part of this dissertation, we present a method for adaptively estimating the threshold of sequential detectors using Extreme Value Theory in a Bayesian framework. Finally, conclusions on the results obtained are provided, followed by a discussion of possible future work.
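A minimal sketch of the iterative shrinkage (ISTA) step for a sparsity-constrained separation of the kind posed above, written with a generic dense synthesis operator Phi standing in for the shearlet frame (an assumption of this sketch): it alternates a gradient step on the data-fit term with soft-thresholding, solving min_c 0.5*||y - Phi c||^2 + lam*||c||_1.

import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(y, Phi, lam, n_iter=200):
    """Iterative shrinkage-thresholding for min_c 0.5||y - Phi c||^2 + lam||c||_1."""
    L = np.linalg.norm(Phi, 2) ** 2    # Lipschitz constant of the gradient
    c = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ c - y)   # gradient of the quadratic data-fit term
        c = soft_threshold(c - grad / L, lam / L)
    return c

After the matrix products, each coefficient is updated independently, which is the kind of parallelism a GPU implementation of such an algorithm exploits.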