930 results for palpebral fissure anomaly
Abstract:
Java software or libraries can evolve via subclassing. Unfortunately, subclassing may not properly support code adaptation when there are dependencies between classes. More precisely, subclassing in collections of related classes may require reimplementation of otherwise valid classes. This problem, known as the subclassing anomaly, is an issue whenever software evolution or code reuse is a goal of a programmer who is building on existing classes. Object Teams offers an implicit fix to this problem and is largely compatible with existing JVMs. In this paper, we evaluate how well Object Teams succeeds in providing a solution for a complex, real-world project. Our results indicate that while Object Teams is a suitable solution for simple examples, it does not meet the requirements of large-scale projects. The reasons why Object Teams fails in certain usages may prove useful to those who design linguistic extensions to languages or who seek new methods for code adaptation.
Abstract:
Subclassing in collections of related classes may require re-implementation of otherwise valid classes simply because they rely on outdated parent classes, a phenomenon referred to as the subclassing anomaly. The subclassing anomaly is a serious problem, since it can void the benefits of code reuse altogether. This paper offers an analysis of the subclassing anomaly in an evolving object-oriented compiler. It also outlines a solution to the subclassing anomaly based on an alternative code reuse mechanism named class overriding.
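Although both abstracts above concern Java, the shape of the subclassing anomaly can be sketched language-independently. In this minimal Python sketch (all class names are hypothetical), Graph is hard-wired to instantiate Node, so extending Node forces the otherwise valid Graph to be re-implemented as well:

```python
class Node:
    def label(self):
        return "node"

class Graph:
    def make_node(self):
        return Node()          # dependency on the concrete parent class

class ColoredNode(Node):
    def label(self):
        return "colored node"

# Reusing Graph together with ColoredNode forces re-implementation of
# Graph, even though Graph's own logic is still valid -- this is the
# subclassing anomaly in miniature:
class ColoredGraph(Graph):
    def make_node(self):
        return ColoredNode()   # otherwise-valid class rewritten

assert isinstance(Graph().make_node(), Node)
assert ColoredGraph().make_node().label() == "colored node"
```

Mechanisms such as Object Teams or the class overriding proposed above aim to let the extended Node propagate through the class family without this manual rewrite.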
Abstract:
Intrusion detection is a critical component of security information systems. The intrusion detection process attempts to detect malicious attacks by examining various data collected during processes on the protected system. This paper examines anomaly-based intrusion detection based on sequences of system calls. The aim is to construct a model that describes normal or acceptable system activity using the classification trees approach. The resulting database is then used as a basis for distinguishing intrusive activity from legitimate activity using string metric algorithms. The main results of the simulation experiments are presented and discussed.
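The paper's classification-tree model is not reproduced here, but the general idea of scoring system-call sequences against a database of normal activity can be sketched with a simple n-gram scheme (function names and traces are illustrative, not the paper's method):

```python
def ngrams(seq, n=3):
    """All contiguous n-grams of a system-call trace."""
    return {tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)}

def build_normal_db(traces, n=3):
    """Database of n-grams observed in normal (training) traces."""
    db = set()
    for t in traces:
        db |= ngrams(t, n)
    return db

def anomaly_score(trace, db, n=3):
    """Fraction of the trace's n-grams never seen in normal activity."""
    grams = ngrams(trace, n)
    if not grams:
        return 0.0
    return len(grams - db) / len(grams)

normal = [["open", "read", "read", "close"],
          ["open", "read", "write", "close"]]
db = build_normal_db(normal)
assert anomaly_score(["open", "read", "read", "close"], db) == 0.0
assert anomaly_score(["open", "exec", "socket", "close"], db) == 1.0
```

A string metric (e.g. edit distance to the nearest normal trace) could replace the set-difference score; the classification-tree model in the paper plays the role of the database here.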
Abstract:
This thesis stems from a project with the real-time environmental monitoring company EMSAT Corporation, which was looking for methods to automatically flag spikes and other anomalies in its environmental sensor data streams. The problem presents several challenges: near real-time anomaly detection, absence of labeled data, and time-changing data streams. We address it using both a statistical parametric approach and a non-parametric approach, Kernel Density Estimation (KDE). The main contribution of this thesis is extending KDE to work more effectively for evolving data streams, particularly in the presence of concept drift. To that end, we have developed a framework for integrating the Adaptive Windowing (ADWIN) change detection algorithm with KDE. We have tested this approach on several real-world data sets and received positive feedback from our industry collaborator. Some results appearing in this thesis were presented at the ECML PKDD 2015 Doctoral Consortium.
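A minimal sketch of the underlying idea, assuming a fixed sliding window in place of the thesis's ADWIN integration: a point is flagged when its Gaussian kernel density over the recent window falls below a threshold (all names, window sizes and thresholds are illustrative):

```python
import math

def kde_density(x, window, bandwidth=1.0):
    """Gaussian kernel density estimate of x over the current window."""
    if not window:
        return 0.0
    c = 1.0 / (len(window) * bandwidth * math.sqrt(2 * math.pi))
    return c * sum(math.exp(-0.5 * ((x - w) / bandwidth) ** 2) for w in window)

def stream_anomalies(stream, window_size=50, threshold=1e-3):
    """Flag each point whose density under the windowed KDE is too low."""
    window, flags = [], []
    for x in stream:
        flags.append(len(window) >= 10 and kde_density(x, window) < threshold)
        window.append(x)
        if len(window) > window_size:  # fixed window; ADWIN would adapt this
            window.pop(0)
    return flags

data = [20.0 + 0.1 * (i % 5) for i in range(60)] + [95.0]
flags = stream_anomalies(data)
assert not any(flags[:60])  # routine readings pass
assert flags[60]            # the spike is flagged
```

ADWIN's contribution, per the abstract, is replacing the fixed window with one that shrinks automatically when a distribution change (concept drift) is detected.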
Abstract:
Various works of modern and contemporary poetry stage a lyric subject's relationship to writing. This question finds a particularly interesting embodiment in the work of Patrice Desbiens, notably in some of his texts from the 1990s and 2000s, where it appears with greater acuity. Yet his self-reflexive practice has been the subject of very little research. In order to shed light on Patrice Desbiens's relationship to writing and to poetry, this thesis examines two of his texts, La fissure de la fiction (1997) and Désâmé (2005), devoting more space to the first, which I consider a pivotal text in Desbiens's poetic production. First, my work presents the precariousness that characterises the protagonist of La fissure de la fiction and, from another angle, the lyric subject of Désâmé. In this light, the figure of the poet in La fissure de la fiction is studied through the ironic reprise of the myth of the literary curse and the meaning that the reactualisation of this myth confers on the character in this poetic narrative. Second, this thesis sets out to show that the coherence and verisimilitude of the worlds staged in La fissure de la fiction and Désâmé are undermined. Against the background of these analyses, we can then consider the role of a poetry that, in the last instance, nevertheless retains a consoling character, despite or because of the aesthetic of the grotesque, at times comic, at times tragic, in which it is inscribed and which we endeavour to bring to light.
Abstract:
Gravity surveying is challenging in Antarctica because of its hostile environment and inaccessibility. Nevertheless, many ground-based, airborne and shipborne gravity campaigns have been completed by the geophysical and geodetic communities since the 1980s. We present the first modern Antarctic-wide gravity data compilation derived from 13 million data points covering an area of 10 million km², which corresponds to 73% coverage of the continent. The remove-compute-restore technique was applied for gridding, which facilitated levelling of the different gravity datasets with respect to an Earth Gravity Model derived from satellite data alone. The resulting free-air and Bouguer gravity anomaly grids of 10 km resolution are publicly available. These grids will enable new high-resolution combined Earth Gravity Models to be derived and represent a major step forward towards solving the geodetic polar data gap problem. They provide a new tool to investigate continental-scale lithospheric structure and geological evolution of Antarctica.
Abstract:
The deep sea sedimentary record is an archive of the pre-glacial to glacial development of Antarctica and of changes in climate, tectonics and ocean circulation. Identification of the pre-glacial, transitional and full glacial components in the sedimentary record is necessary for ice sheet reconstruction and to build circum-Antarctic sediment thickness grids for past topography and bathymetry reconstructions, which constrain paleoclimate models. A ~3300 km long Weddell Sea to Scotia Sea transect consisting of multichannel seismic reflection data from various organisations was used to interpret new horizons, define an initial basin-wide seismostratigraphy and identify the pre-glacial to glacial components. We mapped seven main units, of which three are in the inferred Cretaceous-Paleocene pre-glacial regime, one in the Eocene-Oligocene transitional regime and three in the Miocene-Pleistocene full glacial climate regime. Sparse borehole data from ODP leg 113 and SHALDRIL constrain the ages of the upper three units. Compiled seafloor spreading magnetic anomalies constrain the basement ages and the hypothetical age model. In many cases, the new horizons and stratigraphy contradict the interpretations of local studies. Each seismic sedimentary unit and its associated base horizon are continuous and traceable for the entire transect length, but reflect a lateral change in age whilst representing the same deposition process. The up to 1240 m thick pre-glacial seismic units form a mound in the central Weddell Sea basin and, in conjunction with the eroded flank geometry, support the interpretation of a Cretaceous proto-Weddell Gyre. The base reflector of the transitional seismic unit, which marks the initial ice sheet advances to the outer shelf, has a lateral model age of 26.6-15.5 Ma from southeast to northwest. The Pliocene-Pleistocene glacial deposits reveal lower sedimentation rates, indicating a reduced sediment supply.
Sedimentation rates for the pre-glacial, transitional and full glacial components are highest around the Antarctic Peninsula, indicating higher erosion and sediment supply of a younger basement. We interpret an Eocene East Antarctic Ice Sheet expansion, Oligocene grounding of the West Antarctic Ice Sheet and Early Miocene grounding of the Antarctic Peninsula Ice Sheet.
Abstract:
FPGAs and GPUs are often used when real-time performance in video processing is required. An accelerated processor is chosen based on task-specific priorities (power consumption, processing time and detection accuracy), and this decision is normally made once at design time. All three characteristics are important, particularly in battery-powered systems. Here we propose a method for moving the selection of processing platform from a single design-time choice to a continuous run-time one. We implement Histogram of Oriented Gradients (HOG) detectors for cars and people and Mixture of Gaussians (MoG) motion detectors running across FPGA, GPU and CPU in a heterogeneous system. We use this to detect illegally parked vehicles in urban scenes. Power, time and accuracy information for each detector is characterised. An anomaly measure is assigned to each detected object based on its trajectory and location, compared to learned contextual movement patterns. This drives processor and implementation selection, so that scenes with high behavioural anomalies are processed with faster but more power-hungry implementations, while routine or static time periods are processed with power-optimised, less accurate, slower versions. Real-time performance is evaluated on video datasets including i-LIDS. Compared to power-optimised static selection, automatic dynamic implementation mapping is 10% more accurate but draws 12 W extra power in our testbed desktop system.
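The run-time mapping described above can be sketched as a simple policy that translates the scene's behavioural anomaly level into a platform choice. The characterisation table and thresholds below are illustrative placeholders, not the paper's measured values:

```python
# Hypothetical characterisation of each implementation
# (power in watts, latency in ms, detection accuracy).
IMPLEMENTATIONS = {
    "cpu_low_power": {"power": 8.0,  "latency": 120.0, "accuracy": 0.80},
    "fpga":          {"power": 12.0, "latency": 40.0,  "accuracy": 0.86},
    "gpu":           {"power": 20.0, "latency": 15.0,  "accuracy": 0.90},
}

def select_implementation(anomaly_score, high=0.7, low=0.3):
    """Map the scene's behavioural anomaly level to a platform:
    anomalous scenes get the fast, accurate, power-hungry version;
    routine scenes get the power-optimised, slower one."""
    if anomaly_score >= high:
        return "gpu"
    if anomaly_score >= low:
        return "fpga"
    return "cpu_low_power"

assert select_implementation(0.9) == "gpu"
assert select_implementation(0.5) == "fpga"
assert select_implementation(0.1) == "cpu_low_power"
```

In the paper the anomaly measure driving this choice is derived from object trajectories and locations against learned movement patterns; here it is simply assumed to be a score in [0, 1].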
Abstract:
This work addresses the problem of detecting human behavioural anomalies in crowded surveillance environments. We focus in particular on detecting subtle anomalies in a behaviourally heterogeneous surveillance scene. To reach this goal we implement a novel unsupervised context-aware process. We propose and evaluate a method of utilising social context and scene context to improve behaviour analysis. We find that in a crowded scene, applying Mutual Information-based social context makes it possible to prevent self-justifying groups and to propagate anomalies through a social network, granting greater anomaly detection capability. Scene context uniformly improves the detection of anomalies in both datasets. The strength of our contextual features is demonstrated by the detection of subtly abnormal behaviours that would otherwise remain indistinguishable from normal behaviour.
Abstract:
OBJECTIVES: The aim of this study was to describe the epidemiology of Ebstein's anomaly in Europe and its association with maternal health and medication exposure during pregnancy.
DESIGN: We carried out a descriptive epidemiological analysis of population-based data.
SETTING: We included data from 15 European Surveillance of Congenital Anomalies Congenital Anomaly Registries in 12 European countries, covering a population of 5.6 million births during 1982-2011.
PARTICIPANTS: Cases included live births, fetal deaths from 20 weeks gestation, and terminations of pregnancy for fetal anomaly.
MAIN OUTCOME MEASURES: We estimated total prevalence per 10,000 births. Odds ratios for exposure to maternal illnesses/medications in the first trimester of pregnancy were calculated by comparing Ebstein's anomaly cases with cardiac and non-cardiac malformed controls, excluding cases with genetic syndromes and adjusting for time period and country.
RESULTS: In total, 264 Ebstein's anomaly cases were recorded; 81% were live births, 2% of which were diagnosed after the 1st year of life; 54% of cases with Ebstein's anomaly or a co-existing congenital anomaly were prenatally diagnosed. Total prevalence rose over time from 0.29 (95% confidence interval (CI) 0.20-0.41) to 0.48 (95% CI 0.40-0.57) (p<0.01). In all, nine cases were exposed to maternal mental health conditions/medications (adjusted odds ratio (adjOR) 2.64, 95% CI 1.33-5.21) compared with cardiac controls. Cases were more likely to be exposed to maternal β-thalassemia (adjOR 10.5, 95% CI 3.13-35.3, n=3) and haemorrhage in early pregnancy (adjOR 1.77, 95% CI 0.93-3.38, n=11) compared with cardiac controls.
CONCLUSIONS: The increasing prevalence of Ebstein's anomaly may be related to better and earlier diagnosis. Our data suggest that Ebstein's anomaly is associated with maternal mental health problems generally rather than lithium or benzodiazepines specifically; therefore, changing or stopping medications may not be preventative. We found new associations requiring confirmation.
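The headline prevalence figure can be reproduced from the reported counts, and the (unadjusted) odds ratio with its Wald confidence interval can be sketched for a 2x2 table. Only the exposed-case counts are reported above, so the control counts below are hypothetical, and the paper's ORs are additionally adjusted for time period and country, which a raw 2x2 table cannot reproduce:

```python
import math

def prevalence_per_10000(cases, births):
    return 10000 * cases / births

# Reported figures: 264 cases among ~5.6 million births.
assert round(prevalence_per_10000(264, 5_600_000), 2) == 0.47

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted OR and 95% Wald CI for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Illustrative control counts only (n=9 exposed cases is from the abstract).
or_, (lo, hi) = odds_ratio_ci(9, 255, 40, 2960)
assert or_ > 1.0 and lo < or_ < hi
```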
Abstract:
To maintain the pace of development set by Moore's law, production processes in semiconductor manufacturing are becoming more and more complex. The development of efficient and interpretable anomaly detection systems is fundamental to keeping production costs low. Since the dimension of process monitoring data can become extremely high, anomaly detection systems suffer from the curse of dimensionality, and dimensionality reduction therefore plays an important role. Classical dimensionality reduction approaches, such as Principal Component Analysis, generally involve transformations that seek to maximize the explained variance. In datasets with several clusters of correlated variables, the contributions of isolated variables to the explained variance may be insignificant, with the result that they may not be included in the reduced data representation. It is then impossible to detect an anomaly that is reflected only in such isolated variables. In this paper we present a new dimensionality reduction technique that takes account of such isolated variables and demonstrate how it can be used to build an interpretable and robust anomaly detection system for Optical Emission Spectroscopy data.
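The isolated-variable failure mode described above can be demonstrated in a few lines. This sketch illustrates the problem with plain variance-maximising PCA, not the paper's proposed technique; the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Five strongly correlated variables driven by one latent factor,
# plus one isolated variable with modest variance.
latent = rng.normal(size=(n, 1))
cluster = latent + 0.1 * rng.normal(size=(n, 5))
isolated = 0.5 * rng.normal(size=(n, 1))
X = np.hstack([cluster, isolated])

# Variance-maximising PCA via SVD of the centred data matrix.
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = vt[0]

# The first principal component is dominated by the correlated cluster;
# the isolated variable's loading is comparatively tiny, so an anomaly
# expressed only in that variable vanishes from a 1-component projection.
assert min(abs(pc1[:5])) > 0.4
assert abs(pc1[5]) < 0.2
```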