45 results for Anomaly

in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance: 20.00%

Abstract:

We describe a self-consistent magnetic tight-binding theory based on an expansion of the Hohenberg-Kohn density functional to second order about a non-spin-polarized reference density. We show how a first-order expansion about a density having a trial input magnetic moment leads to a fixed moment model. We employ a simple set of tight-binding parameters that accurately describes electronic structure and energetics, and show these to be transferable between first-row transition metals and their alloys. We make a number of calculations of the electronic structure of dilute Cr impurities in Fe, which we compare with results using the local spin density approximation. The fixed moment model provides a powerful means for interpreting complex magnetic configurations in alloys; using this approach, we are able to advance a simple and readily understood explanation for the observed anomaly in the enthalpy of mixing.
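For orientation, the kind of expansion described can be sketched generically as a second-order Taylor expansion of the Hohenberg-Kohn functional about the reference density; the notation below is generic rather than the paper's, and the Stoner-like form of the magnetic term is a common convention, not a quoted result:

```latex
E[\rho] \;\approx\; E[\rho_0]
  \;+\; \int \left.\frac{\delta E}{\delta \rho(\mathbf{r})}\right|_{\rho_0}
        \delta\rho(\mathbf{r})\,\mathrm{d}\mathbf{r}
  \;+\; \frac{1}{2}\iint
        \left.\frac{\delta^{2} E}{\delta \rho(\mathbf{r})\,\delta \rho(\mathbf{r}')}\right|_{\rho_0}
        \delta\rho(\mathbf{r})\,\delta\rho(\mathbf{r}')\,
        \mathrm{d}\mathbf{r}\,\mathrm{d}\mathbf{r}'
```

In magnetic tight-binding models the spin-dependent part of the second-order term is commonly reduced to a Stoner-like on-site contribution of roughly the form $-\tfrac{1}{4}\sum_i I_i m_i^2$, with $I_i$ an exchange (Stoner) parameter and $m_i$ the local moment.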

Relevance: 20.00%

Abstract:

For many decades it has been assumed that an adsorbate centered above a metal surface and with a net negative charge should increase the work function of the surface. However, despite their electronegativity, N adatoms on W{100} cause a significant work function decrease. Here we present a resolution of this anomaly. Using density functional theory, we demonstrate that while the N atom carries a negative charge, of overriding importance is a reduction in the surface overspill electron density into the vacuum, when that charge is engaged in bonding to the adatom. This novel interpretation is fundamentally important in the general understanding of work function changes induced by atomic adsorbates.
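As background to the conventional expectation that the abstract overturns, adsorbate-induced work-function changes are usually discussed through the Helmholtz relation between the work-function shift and the adsorption-induced surface dipole per unit area; the textbook form below is not taken from this paper, and sign conventions vary:

```latex
\Delta\phi \;=\; \frac{e\,\Delta p_{\perp}}{\varepsilon_0}
```

Here $\Delta p_{\perp}$ is the change in the normal component of the surface dipole moment per unit area. On the abstract's reading, the bonding-induced reduction in electron overspill into the vacuum contributes to $\Delta p_{\perp}$ with the opposite sign to the adatom's net negative charge and dominates it, producing the observed decrease.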

Relevance: 20.00%

Abstract:

The light curve of PA-99-N2, one of the recently announced microlensing candidates toward M31, shows small deviations from the standard Paczynski form. We explore a number of possible explanations, including correlations with the seeing, the parallax effect, and a binary lens. We find that the observations are consistent with an unresolved red giant branch or asymptotic giant branch star in M31 being microlensed by a binary lens. We find that the best-fit binary lens mass ratio is ~1.2 x 10^-2, which is one of the most extreme values found for a binary lens so far. If both the source and lens lie in the M31 disk, then the standard M31 model predicts the probable mass range of the system to be 0.02-3.6 M_⊙ (95% confidence limit). In this scenario, the mass of the secondary component is therefore likely to be below the hydrogen-burning limit. On the other hand, if a compact halo object in M31 is lensing a disk or spheroid source, then the total lens mass is likely to lie between 0.09 and 32 M_⊙, which is consistent with the primary being a stellar remnant and the secondary being a low-mass star or brown dwarf. The optical depth (or, alternatively, the differential rate) along the line of sight toward the event indicates that a halo lens is more likely than a stellar lens, provided that dark compact objects comprise no less than 15% (or 5%) of halos.
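For reference, the standard Paczynski point-source, point-lens light curve from which the deviations are measured (standard microlensing notation, not specific to this paper):

```latex
A(u) \;=\; \frac{u^{2}+2}{u\sqrt{u^{2}+4}}, \qquad
u(t) \;=\; \sqrt{u_{0}^{2} + \left(\frac{t-t_{0}}{t_{\mathrm{E}}}\right)^{2}}
```

where $u$ is the source-lens separation in Einstein radii, $u_0$ the impact parameter, $t_0$ the time of closest approach and $t_{\mathrm{E}}$ the Einstein radius crossing time; binary-lens, parallax and blending effects all perturb this form.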

Relevance: 20.00%

Abstract:

The electronic stopping power of H and He moving through gold is obtained to high accuracy using time-evolving density-functional theory, thereby bringing the usual first-principles accuracy to this kind of strongly coupled, continuum nonadiabatic process in condensed matter. The two key unexplained features of what is observed experimentally have been reproduced and understood: (i) the nonlinear behavior of stopping power versus velocity is a gradual crossover as excitations tail into the d-electron spectrum; and (ii) the low-velocity H/He anomaly (the relative stopping powers are contrary to established theory) is explained by the substantial involvement of the d electrons in the screening of the projectile even at the lowest velocities, where the energy loss is generated by s-like electron-hole pair formation only. © 2012 American Physical Society.
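For context, the quantity computed is the electronic stopping power, and the low-velocity regime in which the H/He anomaly appears is conventionally discussed in terms of velocity-proportional, friction-like energy loss; the expressions below are the standard definitions rather than the paper's equations:

```latex
S_{e}(v) \;=\; -\left\langle \frac{\mathrm{d}E}{\mathrm{d}x} \right\rangle,
\qquad
S_{e}(v) \;\simeq\; Q\,v \quad (v \to 0)
```

with $Q$ an electronic friction coefficient; the anomaly is that the measured ordering of the H and He coefficients in gold runs contrary to what established free-electron-like theory predicts.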

Relevance: 20.00%

Abstract:

Cloud data centres are critical business infrastructures and the fastest growing service providers. Detecting anomalies in Cloud data centre operation is vital. Given the vast complexity of the data centre system software stack, applications and workloads, anomaly detection is a challenging endeavour. Current tools for detecting anomalies often use machine learning techniques, application instance behaviours or system metrics distribution, which are complex to implement in Cloud computing environments as they require training, access to application-level data and complex processing. This paper presents LADT, a lightweight anomaly detection tool for Cloud data centres that uses rigorous correlation of system metrics, implemented by an efficient correlation algorithm without the need for training or complex infrastructure setup. LADT is based on the hypothesis that, in an anomaly-free system, metrics from data centre host nodes and virtual machines (VMs) are strongly correlated. An anomaly is detected whenever correlation drops below a threshold value. We demonstrate and evaluate LADT in a Cloud environment, showing that host node I/O operations per second (IOPS) are strongly correlated with the aggregated virtual machine IOPS, but that this correlation vanishes when an application stresses the disk, indicating a node-level anomaly.
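A minimal sketch of the kind of correlation check described, assuming time-aligned per-second IOPS series for a host and its VMs; the window length, threshold and aggregation are illustrative choices, not LADT's published parameters:

```python
import numpy as np

def correlation_alerts(host_iops, vm_iops, window=60, threshold=0.5):
    """Flag windows where host IOPS and aggregated VM IOPS decorrelate.

    host_iops : 1-D array of host-level IOPS samples
    vm_iops   : 2-D array, one row of IOPS samples per VM on that host
    Returns (window_start_index, correlation) pairs whose Pearson
    correlation falls below the threshold.
    """
    agg_vm = vm_iops.sum(axis=0)          # aggregate the VMs on this host
    alerts = []
    for start in range(len(host_iops) - window + 1):
        h = host_iops[start:start + window]
        v = agg_vm[start:start + window]
        r = np.corrcoef(h, v)[0, 1]       # Pearson correlation in the window
        if r < threshold:
            alerts.append((start, r))
    return alerts
```

In an anomaly-free interval the correlation stays close to 1; a process stressing the disk at the host level drives it towards 0, which is the signature the abstract describes.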

Relevance: 20.00%

Abstract:

The problem of detecting spatially coherent groups of data that exhibit anomalous behavior has started to attract attention due to applications across areas such as epidemic analysis and weather forecasting. Earlier efforts from the data mining community have largely focused on finding outliers, individual data objects that display deviant behavior. Such point-based methods are not easy to extend to find groups of data that exhibit anomalous behavior. Scan statistics are methods from the statistics community that have considered the problem of identifying regions where data objects exhibit behavior that is atypical of the general dataset. The spatial scan statistic and methods that build upon it mostly adopt the framework of defining a shape (e.g., circular or elliptical) for candidate regions of objects, repeatedly sampling regions of that shape, and applying a statistical test for anomaly detection. In the past decade, there have been efforts from the statistics community to enhance the efficiency of scan statistics as well as to enable discovery of arbitrarily shaped anomalous regions. On the other hand, the data mining community has started to look at determining anomalous regions that have behavior divergent from their neighborhood. In this chapter, we survey the space of techniques for detecting anomalous regions in spatial data from across the data mining and statistics communities, while outlining connections to well-studied problems in clustering and image segmentation. We analyze the techniques systematically, categorizing them appropriately to provide a structured bird's-eye view of the work on anomalous region detection; we hope that this will encourage better cross-pollination of ideas across communities and help advance the frontier in anomaly detection.
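A compact sketch of the circular spatial scan framework referred to above, using the Poisson log-likelihood ratio commonly associated with Kulldorff's scan statistic; the region shapes, expected counts and the omitted Monte Carlo significance test are all assumptions of the sketch, not details from the chapter:

```python
import numpy as np

def poisson_llr(c_in, e_in, c_tot, e_tot):
    """Log-likelihood ratio for a candidate region with c_in observed and
    e_in expected cases, against totals c_tot / e_tot for the study area."""
    if e_in == 0 or c_in <= e_in:
        return 0.0                                 # score only elevated regions
    c_out, e_out = c_tot - c_in, e_tot - e_in
    term_out = c_out * np.log(c_out / e_out) if c_out > 0 else 0.0
    return c_in * np.log(c_in / e_in) + term_out

def circular_scan(points, counts, expected, radii):
    """Scan circles centred on each data point and return the best region."""
    c_tot, e_tot = counts.sum(), expected.sum()
    best_llr, best_region = 0.0, None
    for i, centre in enumerate(points):
        dist = np.linalg.norm(points - centre, axis=1)
        for r in radii:
            inside = dist <= r
            llr = poisson_llr(counts[inside].sum(), expected[inside].sum(),
                              c_tot, e_tot)
            if llr > best_llr:
                best_llr, best_region = llr, (i, r)
    return best_llr, best_region
```

Significance of the best region would normally be assessed by recomputing the maximum LLR over many random replications of the data, which is the expensive step that much of the efficiency work targets.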

Relevance: 20.00%

Abstract:

Cloud data centres are implemented as large-scale clusters with demanding requirements for service performance, availability and cost of operation. As a result of their scale and complexity, data centres typically exhibit large numbers of system anomalies resulting from operator error, resource over- or under-provisioning, hardware or software failures and security issues. Such anomalies are inherently difficult to identify and resolve promptly via human inspection. Therefore, it is vital in a cloud system to have automatic system monitoring that detects potential anomalies and identifies their source. In this paper we present a lightweight anomaly detection tool for Cloud data centres which combines extended log analysis with rigorous correlation of system metrics, implemented by an efficient correlation algorithm that does not require training or complex infrastructure setup. The LADT algorithm is based on the premise that there is a strong correlation between node-level and VM-level metrics in a cloud system. This correlation drops significantly in the event of a performance anomaly at the node level, and a sustained drop in the correlation can indicate the presence of a true anomaly in the node. The log analysis in LADT assists in determining whether a correlation drop could be caused by naturally occurring cloud management activity such as VM migration, creation, suspension, termination or resizing. In this way, potential anomaly alerts are reasoned about to prevent false positives that could be caused by the cloud operator's activity. We demonstrate LADT with log analysis in a Cloud environment to show how the log analysis is combined with the correlation of system metrics to achieve accurate anomaly detection.
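A small illustrative sketch of the filtering step described, assuming correlation-drop alerts and management events (VM migration, creation, suspension, termination, resizing) have already been extracted with timestamps; the grace window and data shapes are assumptions, not details from the paper:

```python
def confirm_alerts(alerts, mgmt_events, grace_seconds=120):
    """Drop correlation-drop alerts that coincide with logged cloud
    management activity, keeping only likely true node-level anomalies.

    alerts      : list of (timestamp, correlation) pairs
    mgmt_events : list of (timestamp, event_type) pairs parsed from logs
    """
    confirmed = []
    for t, corr in alerts:
        near_event = any(abs(t - et) <= grace_seconds for et, _ in mgmt_events)
        if not near_event:
            confirmed.append((t, corr))   # no benign explanation in the logs
    return confirmed
```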

Relevance: 20.00%

Abstract:

FPGAs and GPUs are often used when real-time performance in video processing is required. An accelerated processor is chosen based on task-specific priorities (power consumption, processing time and detection accuracy), and this decision is normally made once at design time. All three characteristics are important, particularly in battery-powered systems. Here we propose a method for moving the selection of processing platform from a single design-time choice to a continuous run-time one. We implement Histogram of Oriented Gradients (HOG) detectors for cars and people and Mixture of Gaussians (MoG) motion detectors running across FPGA, GPU and CPU in a heterogeneous system. We use this to detect illegally parked vehicles in urban scenes. Power, time and accuracy information for each detector is characterised. An anomaly measure is assigned to each detected object based on its trajectory and location, when compared to learned contextual movement patterns. This drives processor and implementation selection, so that scenes with high behavioural anomalies are processed with faster but more power-hungry implementations, while routine or static time periods are processed with power-optimised, less accurate, slower versions. Real-time performance is evaluated on video datasets including i-LIDS. Compared to power-optimised static selection, automatic dynamic implementation mapping is 10% more accurate but draws 12 W extra power in our testbed desktop system.
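A toy sketch of the run-time mapping policy described: the scene anomaly measure drives which implementation processes the next interval of frames. The thresholds, labels and the particular power/accuracy ordering are illustrative assumptions, not the measured characteristics reported in the paper:

```python
def select_detector(anomaly_score, high=0.7, low=0.3):
    """Choose a detector implementation for the next processing interval."""
    if anomaly_score >= high:
        return "accurate_fast_impl"    # e.g. accelerated HOG: most capable, most power
    if anomaly_score >= low:
        return "balanced_impl"         # intermediate power/accuracy trade-off
    return "power_saving_impl"         # e.g. lightweight motion-only monitoring
```

At run time the score would come from comparing object trajectories and locations against the learned contextual movement patterns, so behaviourally unusual periods receive the expensive implementations while quiet periods fall back to the power-optimised ones.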

Relevance: 20.00%

Abstract:

This work addresses the problem of detecting human behavioural anomalies in crowded surveillance environments. We focus in particular on the problem of detecting subtle anomalies in a behaviourally heterogeneous surveillance scene. To reach this goal we implement a novel unsupervised context-aware process. We propose and evaluate a method of utilising social context and scene context to improve behaviour analysis. We find that in a crowded scene the application of Mutual Information-based social context makes it possible to prevent self-justifying groups and to propagate anomalies through a social network, granting a greater anomaly detection capability. Scene context uniformly improves the detection of anomalies in both datasets. The strength of our contextual features is demonstrated by the detection of subtly abnormal behaviours, which would otherwise remain indistinguishable from normal behaviour.
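One plausible way to quantify Mutual Information-based social context is to estimate MI between discretised behaviour features of co-occurring agents; the histogram estimator below is a generic illustration under that assumption, not the paper's formulation:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of I(X;Y) in nats for two 1-D feature series."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)    # marginal of X
    py = pxy.sum(axis=0, keepdims=True)    # marginal of Y
    nz = pxy > 0                           # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```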

Relevance: 20.00%

Abstract:

OBJECTIVES: The aim of this study was to describe the epidemiology of Ebstein's anomaly in Europe and its association with maternal health and medication exposure during pregnancy.

DESIGN: We carried out a descriptive epidemiological analysis of population-based data.

SETTING: We included data from 15 European Surveillance of Congenital Anomalies Congenital Anomaly Registries in 12 European countries, with a population of 5.6 million births during 1982-2011.

PARTICIPANTS: Cases included live births, fetal deaths from 20 weeks gestation, and terminations of pregnancy for fetal anomaly.

MAIN OUTCOME MEASURES: We estimated total prevalence per 10,000 births. Odds ratios for exposure to maternal illnesses/medications in the first trimester of pregnancy were calculated by comparing Ebstein's anomaly cases with cardiac and non-cardiac malformed controls, excluding cases with genetic syndromes and adjusting for time period and country.

RESULTS: In total, 264 Ebstein's anomaly cases were recorded; 81% were live births, 2% of which were diagnosed after the 1st year of life; 54% of cases with Ebstein's anomaly or a co-existing congenital anomaly were prenatally diagnosed. Total prevalence rose over time from 0.29 (95% confidence interval (CI) 0.20-0.41) to 0.48 (95% CI 0.40-0.57) (p<0.01). In all, nine cases were exposed to maternal mental health conditions/medications (adjusted odds ratio (adjOR) 2.64, 95% CI 1.33-5.21) compared with cardiac controls. Cases were more likely to be exposed to maternal β-thalassemia (adjOR 10.5, 95% CI 3.13-35.3, n=3) and haemorrhage in early pregnancy (adjOR 1.77, 95% CI 0.93-3.38, n=11) compared with cardiac controls.

CONCLUSIONS: The increasing prevalence of Ebstein's anomaly may be related to better and earlier diagnosis. Our data suggest that Ebstein's anomaly is associated with maternal mental health problems generally rather than lithium or benzodiazepines specifically; therefore, changing or stopping medications may not be preventative. We found new associations requiring confirmation.

Relevance: 20.00%

Abstract:

To maintain the pace of development set by Moore's law, production processes in semiconductor manufacturing are becoming more and more complex. The development of efficient and interpretable anomaly detection systems is fundamental to keeping production costs low. As the dimension of process monitoring data can become extremely high, anomaly detection systems are affected by the curse of dimensionality, and dimensionality reduction therefore plays an important role. Classical dimensionality reduction approaches, such as Principal Component Analysis, generally involve transformations that seek to maximize the explained variance. In datasets with several clusters of correlated variables, the contributions of isolated variables to the explained variance may be insignificant, with the result that they may not be included in the reduced data representation. It is then not possible to detect an anomaly if it is only reflected in such isolated variables. In this paper we present a new dimensionality reduction technique that takes account of such isolated variables and demonstrate how it can be used to build an interpretable and robust anomaly detection system for Optical Emission Spectroscopy data.
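A short synthetic illustration of the limitation being addressed (not of the proposed technique): with two clusters of strongly correlated channels plus one isolated channel, a variance-maximising PCA projection gives the isolated channel almost no weight, so an anomaly confined to it disappears after reduction. All variable names and sizes here are made up for the demonstration:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n = 1000
c1, c2 = rng.normal(size=n), rng.normal(size=n)   # drivers of the two clusters
iso = rng.normal(size=n)                          # isolated variable

X = np.column_stack(
    [c1 + 0.05 * rng.normal(size=n) for _ in range(5)]    # cluster 1 channels
    + [c2 + 0.05 * rng.normal(size=n) for _ in range(5)]  # cluster 2 channels
    + [iso]                                                # isolated channel
)

pca = PCA(n_components=2).fit(X)
print(np.abs(pca.components_)[:, -1])   # loadings on the isolated channel
# Both retained components give the isolated channel near-zero weight, so a
# fault perturbing only that channel is invisible in the reduced space.
```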

Relevance: 10.00%

Abstract:

Objective: Within the framework of a health technology assessment and using an economic model, to determine the most clinically and cost-effective policy of scanning and screening for fetal abnormalities in early pregnancy.

Design: A discrete event simulation model of 50,000 singleton pregnancies.

Setting: Maternity services in Scotland.

Population: Women during the first 24 weeks of their pregnancy.

Methods: The mathematical model was populated with data on uptake of screening, prevalence, detection and false positive rates for eight fetal abnormalities, and with costs for ultrasound scanning and serum screening. Inclusion of abnormalities was based on the relative prevalence and clinical importance of conditions and the availability of data. Six strategies for the prenatal identification of abnormalities, including combinations of first and second trimester ultrasound scanning and first and second trimester screening for chromosomal abnormalities, were compared.

Main outcome measures: The number of abnormalities detected and missed, the number of iatrogenic losses resulting from invasive tests, the total cost of each strategy and the cost per abnormality detected were compared between strategies.

Results: First trimester screening for chromosomal abnormalities costs more than second trimester screening but results in fewer iatrogenic losses. Strategies which include a second trimester ultrasound scan result in more abnormalities being detected and have lower costs per anomaly detected.

Conclusions: The preferred strategy includes both first and second trimester ultrasound scans and a first trimester screening test for chromosomal abnormalities. It has been recommended that this policy be offered to all women in Scotland.
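For the main outcome measure, the between-strategy comparison reduces to simple ratios; a minimal sketch with placeholder strategy records (the names and figures are invented for illustration and are not the model's outputs):

```python
def cost_per_detection(strategies):
    """Compute cost per abnormality detected for each screening strategy.

    strategies: dict mapping strategy name to a dict with
                'total_cost' and 'abnormalities_detected'.
    """
    return {name: s["total_cost"] / s["abnormalities_detected"]
            for name, s in strategies.items()}

# Placeholder example values, purely illustrative:
example = {
    "strategy_A": {"total_cost": 4.0e6, "abnormalities_detected": 320},
    "strategy_B": {"total_cost": 4.6e6, "abnormalities_detected": 410},
}
print(cost_per_detection(example))
```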