55 results for Multi Domain Information Model
Abstract:
In this paper we extend the minimum-cost network flow approach to multi-target tracking by incorporating a motion model, allowing the tracker to better cope with long-term occlusions and missed detections. In our new method, the tracking problem is solved iteratively: first, an initial tracking solution is found without the help of motion information. Given this initial set of tracklets, the motion at each detection is estimated and used to refine the tracking solution.
Finally, special edges are added to the tracking graph, allowing a further revised tracking solution to be found, where distant tracklets may be linked based on motion similarity. Our system has been tested on the PETS S2.L1 and Oxford town-center sequences, outperforming the baseline system, and achieving results comparable with the current state of the art.
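A minimal sketch of the kind of motion-based link cost such special edges could carry, assuming a simple constant-velocity model; the weighting and the example numbers below are illustrative, not the paper's actual formulation:

```python
import numpy as np

def motion_link_cost(end_pos, end_vel, start_pos, start_vel, gap_frames):
    """Cost of linking the tail of one tracklet to the head of another.

    Predicts where the first tracklet would be after `gap_frames` frames of
    constant-velocity motion, and penalises both the positional gap and the
    disagreement between the two estimated velocities.
    """
    predicted = end_pos + gap_frames * end_vel            # constant-velocity prediction
    position_error = np.linalg.norm(predicted - start_pos)
    velocity_error = np.linalg.norm(end_vel - start_vel)
    return position_error + 10.0 * velocity_error         # illustrative weighting

# Example: two tracklets separated by a 12-frame occlusion.
cost = motion_link_cost(end_pos=np.array([120.0, 80.0]), end_vel=np.array([2.0, 0.5]),
                        start_pos=np.array([146.0, 85.0]), start_vel=np.array([2.1, 0.4]),
                        gap_frames=12)
print(f"link cost: {cost:.2f}")
```

In a graph formulation of this kind, a low cost would translate into a cheap edge between the two tracklets, so the min-cost flow solver prefers to join them into a single trajectory despite the occlusion gap.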
Abstract:
Three issues are usually associated with threat-prevention intelligent surveillance systems: first, the fusion and interpretation of large-scale, incomplete, heterogeneous information; second, the need to effectively predict a suspect's intention and to rank the potential threat posed by each suspect; third, strategies for allocating limited security resources (e.g., the dispatch of a security team) to prevent a suspect's further actions towards critical assets. However, in the literature, these three issues are seldom considered together in a sensor-network-based intelligent surveillance framework. To address this problem, in this paper we propose a multi-level decision support framework for timely reaction in intelligent surveillance. More specifically, based on a multi-criteria event modeling framework, we design a method to predict the most plausible intention of a suspect. Following this, a decision support model is proposed to rank each suspect based on their threat severity and to determine resource allocation strategies. Finally, formal properties are discussed to justify our framework.
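A minimal sketch of how such ranking and allocation could be wired together, assuming a simple weighted-sum severity score and a greedy dispatch rule; the criteria names, weights, and team count are hypothetical, not the paper's decision model:

```python
# Toy suspects described by hypothetical criteria scored in [0, 1].
suspects = {
    "S1": {"intention_plausibility": 0.9, "proximity_to_asset": 0.7, "capability": 0.6},
    "S2": {"intention_plausibility": 0.4, "proximity_to_asset": 0.9, "capability": 0.3},
    "S3": {"intention_plausibility": 0.8, "proximity_to_asset": 0.2, "capability": 0.8},
}
weights = {"intention_plausibility": 0.5, "proximity_to_asset": 0.3, "capability": 0.2}

def threat_severity(criteria):
    """Weighted-sum aggregation of the criteria into a single severity score."""
    return sum(weights[name] * value for name, value in criteria.items())

# Rank suspects by severity, most threatening first.
ranking = sorted(suspects, key=lambda s: threat_severity(suspects[s]), reverse=True)

# Greedily dispatch a limited number of security teams to the top-ranked suspects.
available_teams = 2
allocation = {suspect: "dispatch team" for suspect in ranking[:available_teams]}

print("ranking:", ranking)
print("allocation:", allocation)
```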
Abstract:
The BDI architecture, where agents are modelled based on their beliefs, desires and intentions, provides a practical approach to developing large-scale systems. However, it is not well suited to modelling complex Supervisory Control And Data Acquisition (SCADA) systems pervaded by uncertainty. In this paper we address this issue by extending the operational semantics of Can(Plan) into Can(Plan)+. We start by modelling the beliefs of an agent as a set of epistemic states where each state, possibly using a different representation, models part of the agent's beliefs. These epistemic states are stratified to make them commensurable and to reason about the uncertain beliefs of the agent. The syntax and semantics of a BDI agent are extended accordingly, and we identify fragments with computationally efficient semantics. Finally, we examine how primitive actions are affected by uncertainty and we define an appropriate form of lookahead planning.
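A minimal data-structure sketch of stratified epistemic states, assuming each stratum is simply a set of propositions tagged with a plausibility rank; this illustrates the stratification idea only and is not the Can(Plan)+ semantics:

```python
from dataclasses import dataclass, field

@dataclass
class EpistemicState:
    """One stratum of the agent's beliefs: a rank plus the propositions held there."""
    rank: int                        # lower rank = more entrenched / more plausible
    propositions: set = field(default_factory=set)

@dataclass
class Beliefs:
    """The agent's belief base as a stratified collection of epistemic states."""
    states: list

    def plausibility(self, proposition: str):
        """Return the rank of the most entrenched stratum containing the proposition,
        or None if no stratum mentions it (a crude commensurability rule)."""
        for state in sorted(self.states, key=lambda s: s.rank):
            if proposition in state.propositions:
                return state.rank
        return None

beliefs = Beliefs(states=[
    EpistemicState(rank=0, propositions={"pump_running"}),
    EpistemicState(rank=1, propositions={"valve_leaking"}),   # less certain stratum
])
print(beliefs.plausibility("pump_running"))   # 0
print(beliefs.plausibility("tank_empty"))     # None
```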
Abstract:
Side-channel analysis of cryptographic systems can allow an adversary to recover secret information even where the underlying algorithms have been shown to be provably secure. This is achieved by exploiting the unintentional leakages inherent in the underlying implementation of the algorithm in software or hardware. Within this field of research, a class of attacks known as profiling attacks, or more specifically, as used here, template attacks, have been shown to be extremely efficient at extracting secret keys. Template attacks assume a strong adversarial model, in that an attacker has an identical device with which to profile the power consumption of various operations. This profile can then be used to efficiently attack the target device. Inherent in this assumption is that the power consumption across the devices under test is somewhat similar. This central tenet of the attack is largely unexplored in the literature, with the research community generally performing the profiling stage on the same device as is attacked. This is beneficial for evaluation or penetration testing, as it is essentially the best-case scenario for an attacker, where the model built during the profiling stage matches exactly that of the target device; however, it is not necessarily a reflection of how the attack will work in reality. In this work, a large-scale evaluation of this assumption is performed, comparing the key recovery performance across 20 identical smart cards when performing a profiling attack.
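A minimal numpy sketch of the profiling and matching steps of a template attack, assuming traces are already aligned and labelled by an intermediate value; the trace dimensions, synthetic leakage, and pooled-covariance choice are illustrative:

```python
import numpy as np

def build_templates(traces, labels):
    """Profiling phase: per intermediate value, estimate a mean trace and a
    pooled covariance matrix from the profiling device's traces."""
    classes = np.unique(labels)
    means = {c: traces[labels == c].mean(axis=0) for c in classes}
    pooled = sum(np.cov(traces[labels == c], rowvar=False) for c in classes) / len(classes)
    return means, pooled

def match(trace, means, pooled):
    """Attack phase: score each class by its Gaussian log-likelihood on the target trace."""
    inv = np.linalg.inv(pooled)
    _, logdet = np.linalg.slogdet(pooled)
    scores = {}
    for c, mu in means.items():
        d = trace - mu
        scores[c] = -0.5 * (d @ inv @ d + logdet)
    return max(scores, key=scores.get)   # most likely intermediate value

# Synthetic example: 4 classes, 200 profiling traces of 10 samples each.
rng = np.random.default_rng(0)
labels = rng.integers(0, 4, size=200)
traces = rng.normal(size=(200, 10)) + labels[:, None] * 0.5   # leakage grows with the class
means, pooled = build_templates(traces, labels)
target = rng.normal(size=10) + 3 * 0.5                        # a trace leaking class 3
print(match(target, means, pooled))
```

When profiling and target devices differ, the means and covariance estimated here no longer match the target's leakage exactly, which is precisely the assumption the paper evaluates.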
Abstract:
We present a novel method for the light-curve characterization of Pan-STARRS1 Medium Deep Survey (PS1 MDS) extragalactic sources into stochastic variables (SVs) and burst-like (BL) transients, using multi-band image-differencing time-series data. We select detections in difference images associated with galaxy hosts using a star/galaxy catalog extracted from the deep PS1 MDS stacked images, and adopt a maximum a posteriori formulation to model their difference-flux time series in the four Pan-STARRS1 photometric bands gP1, rP1, iP1, and zP1. We use three deterministic light-curve models to fit BL transients: a Gaussian, a Gamma distribution, and an analytic supernova (SN) model; and one stochastic light-curve model, the Ornstein-Uhlenbeck process, to fit variability that is characteristic of active galactic nuclei (AGNs). We assess the quality of fit of the models band-wise and source-wise, using their estimated leave-one-out cross-validation likelihoods and corrected Akaike information criteria. We then apply a K-means clustering algorithm to these statistics to determine the source classification in each band. The final source classification is derived as a combination of the individual filter classifications, resulting in two measures of classification quality, from the averages across the photometric filters of (1) the classifications determined from the closest K-means cluster centers, and (2) the square distances from the clustering centers in the K-means clustering spaces. For a verification set of AGNs and SNe, we show that SVs and BL transients occupy distinct regions in the plane constituted by these measures. We use our clustering method to characterize 4361 extragalactic image-difference detected sources, in the first 2.5 yr of the PS1 MDS, into 1529 BL and 2262 SV, with a purity of 95.00% for AGNs and 90.97% for SNe, based on our verification sets. We combine our light-curve classifications with their nuclear or off-nuclear host galaxy offsets to define a robust photometric sample of 1233 AGNs and 812 SNe. With these two samples, we characterize their variability and host galaxy properties, and identify simple photometric priors that would enable their real-time identification in future wide-field synoptic surveys.
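A minimal sketch of the per-band clustering step, assuming each source has been reduced to two fit statistics (for instance, a corrected AIC difference and a cross-validation likelihood ratio between the BL and SV model families); the feature construction and the two-cluster Lloyd iteration below are illustrative, not the paper's pipeline:

```python
import numpy as np

def kmeans_two_clusters(features, n_iter=50, seed=0):
    """Plain Lloyd's algorithm with k=2 on an (n_sources, 2) array of fit statistics."""
    rng = np.random.default_rng(seed)
    centers = features[rng.choice(len(features), size=2, replace=False)]
    for _ in range(n_iter):
        # Assign each source to the nearest center, then recompute the centers.
        dists = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        centers = np.array([features[labels == k].mean(axis=0) for k in range(2)])
    return labels, centers

# Synthetic fit statistics: one blob mimics burst-like fits, the other stochastic fits.
rng = np.random.default_rng(1)
bl_like = rng.normal([-2.0, 1.5], 0.4, size=(40, 2))
sv_like = rng.normal([1.5, -1.0], 0.4, size=(60, 2))
features = np.vstack([bl_like, sv_like])
labels, centers = kmeans_two_clusters(features)
print(centers)              # two cluster centers in the fit-statistic plane
print(np.bincount(labels))  # sizes of the two clusters
```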
Abstract:
Traditional experimental economics methods often consume enormous resources of qualified human participants, and the inconsistency of a participant's decisions across repeated trials prevents sensitivity analyses. The problem can be solved if computer agents are capable of generating behaviors similar to those of the given participants in experiments. An experimental-economics-based analysis method is presented to extract deep information from questionnaire data and emulate any number of participants. Taking customers' willingness to purchase electric vehicles (EVs) as an example, multi-layer correlation information is extracted from a limited number of questionnaires. Multi-agents mimicking the inquired potential customers are modelled by matching the probabilistic distributions of their willingness embedded in the questionnaires. The authenticity of both the model and the algorithm is validated by comparing the agent-based Monte Carlo simulation results with the questionnaire-based deduction results. With the aid of agent models, the effects of minority agents with specific preferences on the results are also discussed.
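A minimal sketch of the agent-emulation idea, assuming the questionnaires have been reduced to an empirical distribution over willingness-to-purchase levels; the levels, probabilities, and subsidy rule are hypothetical, not the paper's model:

```python
import numpy as np

# Hypothetical empirical distribution of EV purchase willingness from questionnaires.
levels = np.array([0.1, 0.3, 0.5, 0.7, 0.9])               # willingness-to-purchase scores
probabilities = np.array([0.15, 0.25, 0.30, 0.20, 0.10])   # observed shares of respondents

def simulate_adoption(n_agents, subsidy=0.0, n_runs=1000, seed=0):
    """Monte Carlo over agent populations: each agent buys an EV if its sampled
    willingness, shifted by a hypothetical subsidy effect, exceeds 0.5."""
    rng = np.random.default_rng(seed)
    adoption_rates = []
    for _ in range(n_runs):
        willingness = rng.choice(levels, size=n_agents, p=probabilities)
        adoption_rates.append(np.mean(willingness + subsidy > 0.5))
    return np.mean(adoption_rates), np.std(adoption_rates)

print(simulate_adoption(n_agents=500))               # baseline adoption rate
print(simulate_adoption(n_agents=500, subsidy=0.2))  # sensitivity to a policy lever
```

Because the agents are sampled rather than surveyed, the same population can be re-run any number of times, which is what makes sensitivity analyses of this kind feasible.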
Abstract:
Tanpura string vibrations have been investigated previously using numerical models based on energy-conserving schemes derived from a Hamiltonian description in one-dimensional form. Such time-domain models have the property that, for the lossless case, the numerical Hamiltonian (representing the total energy of the system) can be proven to be constant from one time step to the next, irrespective of any of the system parameters; in practice the Hamiltonian can be shown to be conserved within machine precision. Models of this kind can reproduce the jvari effect, which results from the bridge-string interaction. However, the one-dimensional formulation has recently been shown to fail to replicate the jvari's strong dependence on the thread placement. As a first step towards simulations which accurately emulate this sensitivity to the thread placement, a two-dimensional model is proposed, incorporating coupling of controllable level between the two string polarisations at the string termination opposite from the barrier. In addition, a friction force acting when the string slides across the bridge in the horizontal direction is introduced, thus effecting a further damping mechanism. In this preliminary study, the string is terminated at the position of the thread. As in the one-dimensional model, an implicit scheme has to be used to solve the system, employing Newton's method to calculate the updated positions and momenta of each string segment. The two-dimensional model is proven to be energy conserving when the loss parameters are set to zero, irrespective of the coupling constant. Both frequency-dependent and frequency-independent losses are then added to the string, so that the model can be compared to analogous instruments. The influence of the coupling and the bridge friction is investigated.
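A minimal sketch of the energy-conservation property being exploited, assuming a plain lossless mass-spring string discretisation integrated with the implicit midpoint rule (which conserves a quadratic Hamiltonian to machine precision); this is a generic illustration, not the tanpura model with its bridge, thread, and polarisation coupling:

```python
import numpy as np

# Lossless string as N masses coupled by springs (fixed ends), state y = [u, v].
N, mass, stiffness, dt, steps = 20, 1.0, 400.0, 1e-3, 2000
K = stiffness * (2 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1))   # stiffness matrix

# First-order form dy/dt = A y of the Hamiltonian system.
A = np.block([[np.zeros((N, N)), np.eye(N)],
              [-K / mass,        np.zeros((N, N))]])

# Implicit midpoint rule: (I - dt/2 A) y_{n+1} = (I + dt/2 A) y_n.
I = np.eye(2 * N)
step = np.linalg.solve(I - 0.5 * dt * A, I + 0.5 * dt * A)

def hamiltonian(y):
    u, v = y[:N], y[N:]
    return 0.5 * mass * v @ v + 0.5 * u @ K @ u   # kinetic + potential energy

y = np.concatenate([np.sin(np.pi * np.arange(1, N + 1) / (N + 1)), np.zeros(N)])
H0 = hamiltonian(y)
for _ in range(steps):
    y = step @ y
print(f"relative energy drift: {abs(hamiltonian(y) - H0) / H0:.2e}")  # ~machine precision
```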
Abstract:
In dynamic spectrum access networks, cognitive radio terminals monitor their spectral environment in order to detect and opportunistically access unoccupied frequency channels. The overall performance of such networks depends on the spectrum occupancy or availability patterns. Accurate knowledge of channel availability enables optimum performance of such networks in terms of spectrum and energy efficiency. This work proposes a novel probabilistic channel availability model that can describe the channel availability in different polarizations for mobile cognitive radio terminals that are likely to change their orientation during operation. A Gaussian approximation is used to model the empirical occupancy data obtained through a measurement campaign in the cellular frequency bands within a realistic operational scenario.
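A minimal sketch of the Gaussian-approximation idea, assuming per-polarization occupancy samples (e.g. duty cycles) have already been extracted from the measurements; the sample values and the decision threshold are hypothetical:

```python
import math

# Hypothetical measured occupancy (duty-cycle) samples for one channel and polarization.
occupancy_samples = [0.12, 0.18, 0.09, 0.22, 0.15, 0.11, 0.19, 0.14, 0.16, 0.13]

# Fit a Gaussian approximation to the empirical occupancy.
n = len(occupancy_samples)
mean = sum(occupancy_samples) / n
variance = sum((x - mean) ** 2 for x in occupancy_samples) / (n - 1)
std = math.sqrt(variance)

def availability_probability(threshold):
    """P(occupancy < threshold) under the fitted Gaussian, i.e. the probability the
    channel is considered available for opportunistic access."""
    z = (threshold - mean) / (std * math.sqrt(2.0))
    return 0.5 * (1.0 + math.erf(z))

print(f"mean occupancy: {mean:.3f}, std: {std:.3f}")
print(f"P(available at 20% occupancy threshold): {availability_probability(0.20):.3f}")
```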
Abstract:
Passive intermodulation (PIM) often limits the performance of communication systems, particularly in the presence of multiple carriers. Since the origins of the apparently multiple physical sources of nonlinearity causing PIM in distributed circuits are not fully understood, behavioural models are frequently employed to describe the process of PIM generation. In this paper, a memoryless nonlinear polynomial model, capable of predicting high-order multi-carrier intermodulation products, is deduced from third-order two-tone PIM measurements on a microstrip transmission line with distributed nonlinearity. The analytical model of passive distributed nonlinearity is implemented in Keysight Technologies' ADS simulator to evaluate the adjacent-band power ratio for three-tone signals. The obtained results suggest that costly multi-carrier test setups could be replaced by a simulation tool based on a properly retrieved nonlinear polynomial model.
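A minimal sketch of how a memoryless polynomial nonlinearity produces intermodulation products for a two-tone excitation, assuming illustrative tone frequencies and coefficients rather than values retrieved from PIM measurements:

```python
import numpy as np

fs = 1_000_000.0                      # sample rate, Hz (illustrative)
f1, f2 = 100_000.0, 110_000.0         # two carrier tones, Hz
t = np.arange(0, 0.01, 1.0 / fs)      # 10 ms of signal
x = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)

# Memoryless third-order polynomial nonlinearity (coefficients are illustrative).
a1, a3 = 1.0, 1e-3
y = a1 * x + a3 * x ** 3

# Inspect the spectrum: third-order PIM products appear at 2*f1 - f2 and 2*f2 - f1.
spectrum = np.abs(np.fft.rfft(y)) / len(y)
freqs = np.fft.rfftfreq(len(y), d=1.0 / fs)
carrier_level = spectrum[np.argmin(np.abs(freqs - f1))]

for f_pim in (2 * f1 - f2, 2 * f2 - f1):
    idx = np.argmin(np.abs(freqs - f_pim))
    level_db = 20 * np.log10(spectrum[idx] / carrier_level)
    print(f"PIM product at {freqs[idx] / 1e3:.0f} kHz: {level_db:.1f} dBc")
```

Once the polynomial coefficients are fitted to two-tone measurements, the same model can be driven with three or more carriers in a circuit simulator to predict higher-order multi-carrier products, which is the substitution the paper argues for.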