991 results for Rare events


Relevance: 100.00%

Abstract:

One main challenge in developing a system for visual surveillance event detection is the annotation of target events in the training data. Making use of the assumption that events of security interest are often rare compared to regular behaviours, this paper presents a novel approach using Kullback-Leibler (KL) divergence for rare event detection in a weakly supervised learning setting, where only clip-level annotation is available. We show that this approach outperforms state-of-the-art methods on a popular real-world dataset, while preserving real-time performance.

Relevance: 100.00%

Abstract:

Video surveillance infrastructure has been widely installed in public places for security purposes. However, live video feeds are typically monitored by human staff, making the detection of important events as they occur difficult. As such, an expert system that can automatically detect events of interest in surveillance footage is highly desirable. Although a number of approaches have been proposed, they have significant limitations: supervised approaches, which can detect a specific event, ideally require a large number of samples with the event spatially and temporally localised; while unsupervised approaches, which do not require this demanding annotation, can only detect whether an event is abnormal and not specific event types. To overcome these problems, we formulate a weakly supervised approach using Kullback-Leibler (KL) divergence to detect rare events. The proposed approach leverages the sparse nature of the target events to its advantage, and we show that this data imbalance guarantees the existence of a decision boundary to separate samples that contain the target event from those that do not. This trait, combined with the coarse annotation used by weakly supervised learning (which only indicates approximately when an event occurs), greatly reduces the annotation burden while retaining the ability to detect specific events. Furthermore, the proposed classifier requires only a decision threshold, simplifying its use compared to other weakly supervised approaches. We show that the proposed approach outperforms state-of-the-art methods on a popular real-world traffic surveillance dataset, while preserving real-time performance.
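The clip-level idea above can be sketched in a few lines: score each clip by the KL divergence between its feature histogram and a reference histogram built from "normal" clips, and flag clips whose divergence exceeds a threshold. The histograms, feature meaning, and threshold below are hypothetical placeholders, not the paper's actual features.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-10):
    """KL divergence D(p || q) between two discrete distributions,
    with a small epsilon to guard against zero bins."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Hypothetical clip-level feature histograms (e.g. motion-code counts).
normal_clip = [40, 30, 20, 10, 0]   # clip annotated "no event"
event_clip = [5, 5, 10, 30, 50]     # clip annotated "contains event"

# A clip whose histogram diverges strongly from the normal reference
# is flagged as containing the rare event.
threshold = 0.5
score = kl_divergence(event_clip, normal_clip)
print(score > threshold)  # flags the clip
```

Because rare-event clips are by assumption unlike the bulk of the data, a single divergence threshold separates them, which is the simplicity the abstract claims.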

Relevance: 100.00%

Abstract:

Estimating rare events from zero-heavy data (data with many zero values) is a common challenge in fisheries science and ecology. For example, loggerhead sea turtles (Caretta caretta) and leatherback sea turtles (Dermochelys coriacea) account for less than 1% of total catch in the U.S. Atlantic pelagic longline fishery. Nevertheless, the Southeast Fisheries Science Center (SEFSC) of the National Marine Fisheries Service (NMFS) is charged with assessing the effect of this fishery on these federally protected species. Annual estimates of loggerhead and leatherback bycatch in a fishery can affect fishery management and species conservation decisions. However, current estimates have wide confidence intervals, and their accuracy is unknown. We evaluate 3 estimation methods, each at 2 spatiotemporal scales, in simulations of 5 spatial scenarios representing incidental capture of sea turtles by the U.S. Atlantic pelagic longline fishery. The delta-lognormal method of estimating bycatch for calendar quarter and fishing area strata was the least biased estimation method in the spatial scenarios believed to be most realistic. This result supports the current estimation procedure used by the SEFSC.
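A minimal sketch of the delta-lognormal idea: the mean catch per set is the proportion of positive sets times the lognormal mean of the positive catches. This is a simplified form (operational estimators such as Pennington's apply a bias correction), and the catch numbers are invented.

```python
import numpy as np

def delta_lognormal_mean(catch):
    """Simplified delta-lognormal estimate of mean catch per set:
    (proportion of positive sets) x (lognormal mean of positives).
    Operational versions use a bias-corrected lognormal mean."""
    catch = np.asarray(catch, dtype=float)
    positives = catch[catch > 0]
    if positives.size == 0:
        return 0.0
    p = positives.size / catch.size
    logs = np.log(positives)
    # exp(mean + var/2) is the mean of a lognormal distribution
    return float(p * np.exp(logs.mean() + logs.var(ddof=1) / 2.0))

# Hypothetical zero-heavy bycatch record: 2 turtles caught in 10 sets
sets = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1]
print(delta_lognormal_mean(sets))  # 0.2 turtles per set
```

Splitting the zeros from the positive catches is what makes the estimator robust on zero-heavy data: the many zeros only enter through the proportion p, not the lognormal fit.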

Relevance: 100.00%

Abstract:

We aim to provide a review of the stochastic discount factor bounds usually applied to diagnose asset pricing models, focusing in particular on the bounds used to analyze the disaster model of Barro (2006), since the stochastic discount factor bounds applied to study the performance of disaster models usually follow that approach. We first present the entropy bounds that provide a diagnosis of the disaster model, namely the methods of Almeida and Garcia (2012, 2016) and Ghosh et al. (2016). We then discuss how their results for the disaster model relate to each other, and also present the findings of similar methodologies that provide different evidence about the performance of the framework developed by Barro (2006).
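For orientation, the two classical diagnostics this literature builds on can be stated compactly; as a hedged sketch (standard textbook forms, not the specific entropy bounds of the papers cited above), the Hansen-Jagannathan variance bound and an entropy bound on a stochastic discount factor m are:

```latex
% Hansen-Jagannathan bound: the SDF's volatility ratio must exceed
% the Sharpe ratio of any excess return R^e
\frac{\sigma(m)}{\mathbb{E}[m]} \;\ge\; \frac{\lvert \mathbb{E}[R^e] \rvert}{\sigma(R^e)}

% Entropy bound, with L(m) = \log \mathbb{E}[m] - \mathbb{E}[\log m]:
% the SDF's entropy must exceed the mean excess log return
L(m) \;\ge\; \mathbb{E}\left[\log R - \log R_f\right]
```

A disaster model such as Barro (2006) is then judged by whether the SDF it implies satisfies such bounds at plausible parameter values.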

Relevance: 100.00%

Abstract:

Stochastic simulation is an important and practical technique for computing probabilities of rare events, such as the payoff probability of a financial option, the probability that a queue exceeds a certain level, or the probability of ruin of an insurer's risk process. Rare events occur so infrequently that they cannot be reasonably recorded during a standard simulation procedure: specific simulation algorithms which counteract the rarity of the event being simulated are required. An important algorithm in this context is based on changing the sampling distribution and is called importance sampling. Optimal Monte Carlo algorithms for computing rare event probabilities are either logarithmically efficient or possess bounded relative error.
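The importance-sampling idea can be illustrated on a toy problem, estimating the Gaussian tail probability P(X > 4) (about 3.2e-5, so naive simulation with 1e5 samples would typically see only a handful of hits or none). The exponential tilt used here is the classical change of measure for this example; the sample size and seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def importance_sampling_tail(gamma, n=100_000):
    """Estimate P(X > gamma) for X ~ N(0,1) by importance sampling:
    draw from the tilted density N(gamma, 1) and reweight by the
    likelihood ratio phi(x)/phi(x - gamma) = exp(-gamma*x + gamma^2/2)."""
    x = rng.normal(loc=gamma, scale=1.0, size=n)
    weights = np.exp(-gamma * x + gamma**2 / 2.0)
    return float(np.mean((x > gamma) * weights))

est = importance_sampling_tail(4.0)
print(est)  # close to the true value P(X > 4) ~ 3.17e-5
```

Because the tilted distribution is centred on the rare region, almost every sample contributes, and the likelihood-ratio weights correct the bias, giving a small relative error where naive simulation fails entirely.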

Relevance: 100.00%

Abstract:

Predicting failures in a distributed system from previous events through logistic regression is a standard approach in the literature. This technique is not reliable, though, in two situations: in the prediction of rare events, which do not appear in sufficient proportion for the algorithm to capture them, and in environments with too many variables, as logistic regression tends to overfit in these situations, while manually selecting a subset of variables to create the model is error-prone. In this paper, we solve an industrial research case that presented this situation with a combination of elastic net logistic regression, a method that allows us to automatically select useful variables, a process of cross-validation on top of it, and the application of a rare events prediction technique to reduce computation time. This process provides two layers of cross-validation that automatically obtain the optimal model complexity and the optimal model parameter values, while ensuring that even rare events will be correctly predicted with a small number of training instances. We tested this method against real industrial data, obtaining a total of 60 out of 80 possible models with a 90% average model accuracy.
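The variable-selection effect of the elastic net can be sketched without the authors' pipeline: a tiny numpy implementation of elastic-net-penalized logistic regression fit by (sub)gradient descent. The data, penalty weights, and step counts below are illustrative assumptions, not the paper's configuration; production work would use a proper solver.

```python
import numpy as np

def elastic_net_logreg(X, y, alpha=0.1, l1_ratio=0.5, lr=0.1, steps=2000):
    """Minimal elastic-net logistic regression via (sub)gradient descent.
    alpha scales the penalty; l1_ratio mixes L1 (sparsity) vs L2."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))        # predicted probabilities
        grad = X.T @ (p - y) / n                # logistic-loss gradient
        grad += alpha * (l1_ratio * np.sign(w)  # L1 subgradient
                         + (1 - l1_ratio) * w)  # L2 gradient
        w -= lr * grad
    return w

# Hypothetical data: only the first of three variables is informative.
rng = np.random.default_rng(1)
X = rng.normal(size=(400, 3))
y = (X[:, 0] > 0).astype(float)
w = elastic_net_logreg(X, y)
print(w)  # the L1 term keeps the two noise coefficients near zero
```

The L1 component is what performs the automatic variable selection the abstract refers to: uninformative coefficients are driven toward zero, while the L2 component keeps the fit stable when variables are correlated.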

Relevance: 100.00%

Abstract:

Temperature chaos has often been reported in the literature as a rare-event-driven phenomenon. However, this fact has always been ignored in the data analysis, thus erasing the signal of the chaotic behavior (still rare at the sizes achieved) and leading to an overall picture of a weak and gradual phenomenon. On the contrary, our analysis relies on a large-deviations functional that allows us to discuss the size dependences. In addition, we had at our disposal unprecedentedly large configurations equilibrated at low temperatures, thanks to the Janus computer. According to our results, when temperature chaos occurs its effects are strong and can be felt even at short distances.

Relevance: 80.00%

Abstract:

This paper provides a novel Exceptional Object Analysis for Finding Rare Environmental Events (EOAFREE). The major contribution of our EOAFREE method is that it proposes a general Improved Exceptional Object Analysis based on Noises (IEOAN) algorithm to efficiently detect and rank exceptional objects. Our IEOAN algorithm is more general than known outlier detection algorithms, in that it can find exceptional objects that may not be on the border; an experimental study shows that it is also far more efficient than recursively applying known clustering algorithms (which may not force every data instance to belong to a cluster) to detect rare events. Another contribution is an approach to preprocessing heterogeneous real-world data by exploring domain knowledge: changes, rather than the water data values themselves, are defined as the input of the IEOAN algorithm, removing the geographical differences between any two sites and the temporal differences between any two years. The effectiveness of our EOAFREE method is demonstrated by a real-world application: detecting water pollution events from the water quality datasets of 93 sites distributed over 10 river basins in Victoria, Australia, between 1975 and 2010. © 2012 Elsevier B.V.

Relevance: 70.00%

Abstract:

The terrorist attacks in the United States on September 11, 2001 appeared to be a harbinger of increased terrorism and violence in the 21st century, bringing terrorism and political violence to the forefront of public discussion. Questions about these events abound, and “Estimating the Historical and Future Probabilities of Large Scale Terrorist Event” [Clauset and Woodard (2013)] asks specifically, “how rare are large scale terrorist events?” and, in general, encourages discussion on the role of quantitative methods in terrorism research and in policy and decision-making. Answering the primary question raises two challenges. The first is identifying terrorist events. The second is finding a simple yet robust model for rare events that has good explanatory and predictive capabilities. The challenge of identifying terrorist events is acknowledged and addressed by reviewing and using data from two well-known and reputable sources: the Memorial Institute for the Prevention of Terrorism-RAND database (MIPT-RAND) [Memorial Institute for the Prevention of Terrorism] and the Global Terrorism Database (GTD) [National Consortium for the Study of Terrorism and Responses to Terrorism (START) (2012), LaFree and Dugan (2007)]. Clauset and Woodard (2013) provide a detailed discussion of the limitations of the data and the models used, in the context of the larger issues surrounding terrorism and policy.
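Models of the kind discussed above typically treat large event severities as a heavy (power-law) tail. As a sketch of that style of analysis (not Clauset and Woodard's actual method), the Hill estimator recovers the tail exponent from the largest observations; the sample below is synthetic with a known exponent.

```python
import numpy as np

def hill_estimator(data, k):
    """Hill estimator of the tail exponent alpha for data whose upper
    tail is approximately Pareto: P(X > x) ~ x**-alpha. Uses the top
    k order statistics."""
    x = np.sort(np.asarray(data, dtype=float))[::-1]  # descending
    return float(k / np.sum(np.log(x[:k] / x[k])))

# Synthetic severities: classical Pareto with known alpha = 2.5
rng = np.random.default_rng(3)
sample = rng.pareto(2.5, size=50_000) + 1.0
alpha = hill_estimator(sample, k=2_000)
print(alpha)  # roughly recovers 2.5
```

Once the tail exponent is estimated, the probability of an event exceeding any severity follows from the fitted tail, which is how such models produce "probability of a large event" statements from historical data.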

Relevance: 70.00%

Abstract:

Intense extra-tropical cyclones are often associated with strong winds, heavy precipitation and socio-economic impacts. Over southwestern Europe, such storms occur less often, but still cause high economic losses. We characterise the large-scale atmospheric conditions and cyclone tracks during the top-100 potential losses over Iberia associated with wind events. Based on 65 years of reanalysis data, events are classified into four groups: (i) cyclone tracks crossing over Iberia on the event day (“Iberia”), (ii) cyclones crossing further north, typically southwest of the British Isles (“North”), (iii) cyclones crossing southwest to northeast near the northwest tip of Iberia (“West”), and (iv) so-called “Hybrids”, characterised by a strong pressure gradient over Iberia due to the juxtaposition of low and high pressure centres. Generally, “Iberia” events are the most frequent (31% to 45% for top-100 vs. top-20), while “West” events are rare (10% to 12%). 70% of the events were primarily associated with a cyclone. Multi-decadal variability in the number of events is identified. While the peak in recent years is quite prominent, other comparably stormy periods occurred in the 1960s and 1980s. This study documents that damaging wind storms over Iberia are not rare events, and that their frequency of occurrence undergoes strong multi-decadal variability.

Relevance: 70.00%

Abstract:

We present a novel method, called the transform likelihood ratio (TLR) method, for estimation of rare event probabilities with heavy-tailed distributions. Via a simple transformation (change of variables) technique, the TLR method reduces the original rare event probability estimation with heavy-tailed distributions to an equivalent one with light-tailed distributions. Once this transformation has been established, we estimate the rare event probability via importance sampling, using the classical exponential change of measure or the standard likelihood ratio change of measure. In the latter case the importance sampling distribution is chosen from the same parametric family as the transformed distribution. We estimate the optimal parameter vector of the importance sampling distribution using the cross-entropy method. We prove the polynomial complexity of the TLR method for certain heavy-tailed models and demonstrate numerically its high efficiency for various heavy-tailed models previously thought to be intractable. We also show that the TLR method can be viewed as a universal tool, in the sense that it not only provides a unified view of heavy-tailed simulation but can also be used efficiently in simulation with light-tailed distributions. We present extensive simulation results which support the efficiency of the TLR method.
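The transformation idea can be shown on a toy case where the answer is known in closed form. For a Pareto variable X with F(x) = 1 - x**(-alpha), writing X = exp(Y/alpha) with Y ~ Exp(1) turns the heavy-tailed event {X > gamma} into the light-tailed event {Y > c}, c = alpha*log(gamma), which a standard exponential change of measure handles well. This is a sketch of the transform-then-tilt principle, not the paper's full method (which also uses cross-entropy optimisation); the parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)

def tlr_pareto_tail(alpha, gamma, n=100_000):
    """Estimate P(X > gamma) for Pareto X (F(x) = 1 - x**-alpha, x >= 1)
    by transforming to Y ~ Exp(1) via X = exp(Y/alpha), then applying
    an exponential change of measure to the light-tailed event {Y > c}."""
    c = alpha * np.log(gamma)
    theta = 1.0 - 1.0 / c                      # tilt the IS mean onto c
    y = rng.exponential(scale=1.0 / (1.0 - theta), size=n)
    lr = np.exp(-theta * y) / (1.0 - theta)    # likelihood ratio
    return float(np.mean((y > c) * lr))

est = tlr_pareto_tail(alpha=1.0, gamma=1e6)
print(est)  # exact probability is gamma**-alpha = 1e-6
```

Directly tilting a heavy-tailed density fails because its moment generating function is infinite; the change of variables is what restores access to the classical exponential tilt.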

Relevance: 60.00%

Abstract:

Stochastic models for competing clonotypes of T cells, formulated as multivariate, continuous-time, discrete-state Markov processes, have been proposed in the literature by Stirk, Molina-París and van den Berg (2008). A stochastic modelling framework is important because of rare events associated with small populations of some critical cell types. Usually, computational methods for these problems employ a trajectory-based approach, based on Monte Carlo simulation. This is partly because the complementary probability density function (PDF) approaches can be expensive, but here we describe some efficient PDF approaches that directly solve the governing equations, known as the Master Equation. These computations are made very efficient through an approximation of the state space by the Finite State Projection and through the use of Krylov subspace methods when evaluating the matrix exponential. These computational methods allow us to explore the evolution of the PDFs associated with these stochastic models, and bimodal distributions arise in some parameter regimes. Time-dependent propensities naturally arise in immunological processes due to, for example, age-dependent effects. Incorporating time-dependent propensities into the framework of the Master Equation significantly complicates the corresponding computational methods, but here we describe an efficient approach via Magnus formulas. Although this contribution focuses on the example of competing clonotypes, the general principles are relevant to multivariate Markov processes and provide fundamental techniques for computational immunology.
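A minimal sketch of the Finite State Projection idea: truncate the infinite state space, build the Master Equation generator on the retained states, and evolve the PDF directly. The model here is a simple immigration-death process (not the clonotype model of the paper), and plain Euler stepping stands in for the Krylov-based matrix exponential the authors use.

```python
import numpy as np

def fsp_generator(lam, mu, n_states):
    """Master Equation generator for an immigration-death process
    (birth rate lam, death rate mu*n), truncated to n_states states:
    a Finite State Projection of the infinite state space."""
    A = np.zeros((n_states, n_states))
    for n in range(n_states):
        if n + 1 < n_states:
            A[n + 1, n] += lam      # birth n -> n+1
            A[n, n] -= lam
        if n > 0:
            A[n - 1, n] += mu * n   # death n -> n-1
            A[n, n] -= mu * n
    return A

# Evolve p(t) solving dp/dt = A p with small Euler steps; production
# codes instead apply the matrix exponential with Krylov methods.
lam, mu = 1.0, 1.0
A = fsp_generator(lam, mu, 20)
p = np.zeros(20)
p[0] = 1.0                          # start with zero cells
dt, t_end = 1e-3, 20.0
for _ in range(int(t_end / dt)):
    p = p + dt * (A @ p)

print(p[0])  # stationary law is Poisson(lam/mu), so p[0] -> exp(-1)
```

Because the whole PDF is available at every time, rare low-population states are resolved exactly within the truncation, where trajectory-based Monte Carlo would need enormous sample sizes.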

Relevance: 60.00%

Abstract:

Even though crashes between trains and road users are rare events at railway level crossings, they are one of the major safety concerns for the Australian railway industry. Near-miss events at level crossings occur more frequently, and can provide more information about factors leading to level crossing incidents. In this paper we introduce a video analytic approach for automatically detecting and localizing vehicles from cameras mounted on trains, for detecting near-miss events. To detect and localize vehicles at level crossings we extract patches from an image and classify each patch for detecting vehicles. We developed a region proposals algorithm for generating patches, and we use a Convolutional Neural Network (CNN) for classifying each patch. To localize vehicles in images we combine the patches that are classified as vehicles according to their CNN scores and positions. We compared our system with the Deformable Part Models (DPM) and Regions with CNN features (R-CNN) object detectors. Experimental results on a railway dataset show that the recall rate of our proposed system is 29% higher than what can be achieved with DPM or R-CNN detectors.

Relevance: 60.00%

Abstract:

Current biosecurity arrangements for plantation forestry are poorly defined, at least relative to other plant-based industries. Serious pest and disease outbreaks in forestry are relatively rare events, and preparedness for rare events is difficult. Part of the difficulty stems from the competing views of managers and stakeholders. This project sought to directly capture alternative views concerning the key objectives of plantation forest biosecurity and alternative strategies for achieving those objectives, and ultimately to recommend preferred actions that might be broadly supported by stakeholders. The outcomes from the workshop were used as a basis to draft a list of strategic actions required to improve forest biosecurity in Australia, to be implemented over the next 2-5 years. The research priorities identified were: research to support cost-benefit analyses; investigation of the effects of changed environmental conditions on forest biosecurity; and pathway analysis for functional pest guilds. Integration of this research within a CRC would also permit its effective development and extension, as well as providing the training urgently required to maintain forest biosecurity and health expertise.

Relevance: 60.00%

Abstract:

Human parvovirus B19 (B19V) is known to cause anemia, hydrops fetalis, and fetal death, especially during the first half of pregnancy. Women who are in occupational contact with young children are at increased risk of B19V infection. The role of the recently discovered human parvovirus, human bocavirus (HBoV), in reproduction is unknown. The aim of this research project was to establish a scientific basis for assessing the work safety of pregnant women and for issuing special maternity leave regulations during B19V epidemics in Finland. The impact of HBoV infection on the pregnant woman and her fetus was also defined. B19V DNA was found in 0.8% of the miscarriages and in 2.4% of the intrauterine fetal deaths (IUFD; fetal death after 22 completed gestational weeks). All control fetuses (from induced abortions) were B19V-DNA negative. The findings on hydropic B19V DNA-positive IUFDs with evidence of acute or recent maternal B19V infection are in line with those of previous Swedish studies. However, the high prevalence of B19V-related nonhydropic IUFDs noted in the Swedish studies was mostly without evidence of maternal B19V infection and was not found during the third trimester. HBoV was not associated with miscarriages or IUFDs. Almost all of the studied pregnant women were HBoV-IgG positive, and thus most probably immune to HBoV. All preterm births, perinatal deaths, small-for-gestational-age (SGA) births and congenital anomalies were recorded among the infants of child-care employees in a nationwide register-based cohort study over a period of 14 years. Little or no difference in the results was found between the infants of the child-care employees and those of the comparison group. The annual B19V seroconversion rate was over two-fold among the child-care employees, compared to the women in the comparison group. The seropositivity of the child-care employees increased with age and with years from qualification/joining the trade union.
In general, the child-care employees are not at increased risk of adverse pregnancy outcomes. However, at the population level, the risk of rare events, such as adverse pregnancy outcomes attributed to infections, could not be determined. According to previous studies, seronegative women had a 5-10% excess risk of losing the fetus during the first half of their pregnancy, but thereafter the risk was very low. Therefore, an over two-fold increased risk of B19V infection among child-care employees is considerable, and should be taken into account in the assessment of the occupational safety of pregnant women, especially during the first half of their pregnancy.