973 results for Robust localisation systems


Relevance:

40.00%

Publisher:

Abstract:

This paper extends the authors' earlier work, which adapted robust multiplexed MPC for application to distributed control of multi-agent systems with non-interacting dynamics and coupled constraint sets in the presence of persistent, unknown but bounded disturbances. Specifically, we propose exploiting the single-agent update nature of the multiplexed approach, and fix the update sequence to enable input move-blocking and increased discretisation rates. This permits a higher rate of individual policy update to be achieved, whilst incurring no additional computational cost in the corresponding optimal control problems to be solved. A disturbance feedback policy is included between updates to facilitate finding feasible solutions. The new formulation inherits the property of rapid response to disturbances from multiplexing the control, and numerical results show that fixing the update sequence does not incur any loss in performance. © 2011 IFAC.
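A minimal sketch of the fixed-sequence, multiplexed update pattern described above, using toy scalar dynamics; the placeholder solver, feedback gain and disturbance bounds are assumptions for illustration, not the paper's formulation:

import numpy as np

# Toy sketch: N agents with scalar integrator dynamics. At each step exactly one
# agent, chosen by a FIXED round-robin sequence, re-solves its optimal control
# problem; the others keep their stored plans and apply a disturbance-feedback
# correction between updates.

def solve_ocp(x, horizon):
    # Placeholder for the agent's finite-horizon constrained optimisation.
    # A real implementation would solve a QP subject to the coupled constraints.
    return np.full(horizon, -0.3 * x)   # crude proportional plan toward the origin

class Agent:
    def __init__(self, x0, horizon=10, k_dist=-0.5):
        self.x = float(x0)
        self.horizon = horizon
        self.plan = solve_ocp(self.x, horizon)   # stored nominal input plan
        self.k_dist = k_dist                     # disturbance-feedback gain (illustrative)
        self.w_prev = 0.0                        # last measured disturbance

    def step(self, w):
        u = self.plan[0] + self.k_dist * self.w_prev   # planned input + feedback on past disturbance
        self.x = self.x + u + w                        # toy dynamics with bounded disturbance w
        self.plan = np.append(self.plan[1:], self.plan[-1])  # shift plan between updates
        self.w_prev = w
        return u

agents = [Agent(1.0), Agent(-2.0), Agent(0.5)]
rng = np.random.default_rng(0)
for t in range(12):
    k = t % len(agents)                                # fixed update sequence
    agents[k].plan = solve_ocp(agents[k].x, agents[k].horizon)  # only agent k re-optimises
    for a in agents:
        a.step(w=rng.uniform(-0.1, 0.1))               # unknown but bounded disturbance
print([round(a.x, 3) for a in agents])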

Relevance:

40.00%

Publisher:

Abstract:

The double neutron/proton ratio of nucleon emissions from two reaction systems using four isotopes of the same element, namely the neutron/proton ratio in the neutron-rich system over that in the more symmetric system, has the advantage of systematically reducing the influence of the Coulomb force and of the normally poor efficiency of detecting low-energy neutrons. The double ratio thus suffers from smaller systematic errors. Within the IBUU04 transport model, the double neutron/proton ratio is shown to have about the same sensitivity to the density dependence of the nuclear symmetry energy as the single neutron/proton ratio in the neutron-rich system involved. The double neutron/proton ratio is therefore more useful for further constraining the symmetry energy of neutron-rich matter.
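Writing Y_n and Y_p for the neutron and proton yields (notation assumed here, not taken from the paper), the quantity described above is

\[
\mathrm{DR}(n/p) \;=\; \frac{\left(Y_n/Y_p\right)_{\text{neutron-rich}}}{\left(Y_n/Y_p\right)_{\text{symmetric}}} ,
\]

so Coulomb and detector-efficiency effects that enter both single ratios in the same way largely cancel in the quotient.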

Relevance:

40.00%

Publisher:

Abstract:

Malicious software (malware) has significantly increased in number and effectiveness over the past years. Until 2006, such software was mostly used to disrupt network infrastructures or to show off coders' skills. Nowadays, malware constitutes a very important source of economic profit and is very difficult to detect. Thousands of novel variants are released every day, and modern obfuscation techniques are used to ensure that signature-based anti-malware systems are not able to detect such threats. This tendency has also appeared on mobile devices, with Android being the most targeted platform. To counteract this phenomenon, the scientific community has developed many approaches that attempt to increase the resilience of anti-malware systems. Most of these approaches rely on machine learning and have also become very popular in commercial applications. However, attackers are now knowledgeable about these systems and have started preparing their countermeasures. This has led to an arms race between attackers and developers: novel systems are progressively built to tackle attacks that get more and more sophisticated. For this reason, developers increasingly need to anticipate the attackers' moves. This means that defense systems should be built proactively, i.e., by introducing security design principles in their development. The main goal of this work is to show that such a proactive approach can be employed on a number of case studies. To do so, I adopted a global methodology that can be divided in two steps: first, understanding the vulnerabilities of current state-of-the-art systems (this anticipates the attacker's moves); then, developing novel systems that are robust to these attacks, or suggesting research guidelines with which current systems can be improved. This work presents two main case studies, concerning the detection of PDF and Android malware. The idea is to show that a proactive approach can be applied to both the x86 and the mobile world. The contributions provided in these two case studies are manifold. With respect to PDF files, I first develop novel attacks that can empirically and optimally evade current state-of-the-art detectors. Then, I propose possible solutions to increase the robustness of such detectors against known and novel attacks. With respect to the Android case study, I first show how current signature-based tools and academically developed systems are weak against empirical obfuscation attacks, which can be easily employed without particular knowledge of the targeted systems. Then, I examine a possible strategy to build a machine learning detector that is robust against both empirical obfuscation and optimal attacks. Finally, I show how proactive approaches can also be employed to develop systems that are not aimed at detecting malware, such as mobile fingerprinting systems. In particular, I propose a methodology to build a powerful mobile fingerprinting system, and examine possible attacks with which users might be able to evade it, thus preserving their privacy.

To provide the aforementioned contributions, I co-developed (in cooperation with researchers at PRALab and Ruhr-Universität Bochum) various systems: a library to perform optimal attacks against machine learning systems (AdversariaLib), a framework for automatically obfuscating Android applications, a system for the robust detection of JavaScript malware inside PDF files (LuxOR), a robust machine learning system for the detection of Android malware, and a system to fingerprint mobile devices. I also contributed to developing Android PRAGuard, a dataset containing a large number of empirical obfuscation attacks against the Android platform. Finally, I entirely developed Slayer NEO, an evolution of a previous system for the detection of PDF malware. The results attained by using the aforementioned tools show that it is possible to proactively build systems that predict possible evasion attacks. This suggests that a proactive approach is crucial to build systems that provide concrete security against general and evasion attacks.
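As an illustration of what a gradient-based "optimal" evasion attack looks like, the sketch below perturbs a sample until it crosses the decision boundary of a hypothetical linear detector; it is a generic textbook example, not AdversariaLib's API nor the exact attacks developed in the thesis:

import numpy as np

# Hypothetical linear detector: score > 0 means "flagged as malware".
w = np.array([0.8, -0.2, 1.5, 0.4])    # assumed learned weights
b = -0.5
x = np.array([1.0, 0.0, 1.0, 1.0])     # assumed malicious sample (normalised features)

def score(sample):
    return w @ sample + b

eta, budget = 0.1, 3.0                  # step size and maximum total feature modification
x_adv, spent = x.copy(), 0.0
while score(x_adv) > 0 and spent < budget:
    step = -eta * w                             # gradient of the linear score w.r.t. the features
    x_adv = np.clip(x_adv + step, 0.0, None)    # keep features feasible (non-negative)
    spent += np.abs(step).sum()

print("original score:", score(x), "evaded score:", score(x_adv))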

Relevance:

40.00%

Publisher:

Abstract:

This paper describes a methodology for deploying flexible dynamic configuration into embedded systems whilst preserving the reliability advantages of static systems. The methodology is based on the concept of decision points (DP), which are strategically placed to achieve fine-grained distribution of self-management logic to meet application-specific requirements. DP logic can be changed easily, and independently of the host component, enabling self-management behavior to be deferred beyond the point of system deployment. A transparent Dynamic Wrapper (DW) mechanism automatically detects and handles problems arising from the evaluation of self-management logic within each DP, and ensures that the dynamic aspects of the system collapse down to statically defined default behavior, preserving safety and correctness despite failures. Dynamic context management contributes to flexibility and removes the need for design-time binding of context providers and consumers, thus facilitating run-time composition and incremental component upgrade.
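A toy sketch of the decision-point/dynamic-wrapper idea described above; the class, method names and example logic are hypothetical, not the paper's API:

class DecisionPoint:
    """Holds replaceable self-management logic plus a statically defined default."""
    def __init__(self, default, logic=None):
        self.default = default      # static fallback behaviour, fixed at design time
        self.logic = logic          # dynamically deployed policy (may be absent or faulty)

    def replace_logic(self, logic):
        # Logic can be swapped at run time, independently of the host component.
        self.logic = logic

    def evaluate(self, context):
        # The "dynamic wrapper": any failure in the dynamic logic collapses
        # to the static default so the host component keeps working.
        if self.logic is None:
            return self.default(context)
        try:
            return self.logic(context)
        except Exception:
            return self.default(context)

# Example: choose a buffer size from run-time context, with a safe static default.
dp = DecisionPoint(default=lambda ctx: 64)
print(dp.evaluate({"load": 0.9}))                        # -> 64 (static behaviour)
dp.replace_logic(lambda ctx: 256 if ctx["load"] > 0.8 else 64)
print(dp.evaluate({"load": 0.9}))                        # -> 256 (dynamic behaviour)
dp.replace_logic(lambda ctx: ctx["missing_key"])         # faulty dynamic logic
print(dp.evaluate({"load": 0.9}))                        # -> 64 (collapses to default)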

Relevance:

40.00%

Publisher:

Abstract:

Background: Co-localisation is a widely used measurement in immunohistochemical analysis to determine whether fluorescently labelled biological entities, such as cells, proteins or molecules, share the same location. However, the measurement of co-localisation is challenging due to the complex nature of such fluorescent images, especially when multiple focal planes are captured. The current state-of-the-art co-localisation measurements of 3-dimensional (3D) image stacks are biased by noise and cross-overs from non-consecutive planes.

Method: In this study, we have developed Co-localisation Intensity Coefficients (CICs) and Co-localisation Binary Coefficients (CBCs), which use rich z-stack data from neighbouring focal planes to identify similarities between the image intensities of two, and potentially more, fluorescently labelled biological entities. The method was developed using z-stack images from murine organotypic slice cultures of central nervous system tissue, and two sets of pseudo-data. A large number of non-specific cross-over situations are excluded by this method. The proposed method is also shown to be robust in recognising co-localisation even when images are polluted with a range of noise.

Results: The proposed CBCs and CICs produce robust co-localisation measurements which are easy to interpret, resilient to noise, and capable of removing a large amount of false positivity, such as non-specific cross-overs. This measurement method is significantly more accurate than existing measurements, as determined statistically using pseudo-datasets of known values. The method provides an important and reliable tool for fluorescent 3D neurobiological studies, and will benefit other biological studies that measure fluorescence co-localisation in 3D.
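The abstract does not give the CIC/CBC formulas, so the sketch below only illustrates the general idea of using neighbouring focal planes to reject single-plane cross-overs before computing a binary overlap coefficient; the threshold, neighbourhood rule and test data are invented for the example:

import numpy as np

def binary_colocalisation(stack_a, stack_b, threshold=0.2):
    """stack_a, stack_b: 3D arrays (z, y, x) of normalised channel intensities."""
    a = stack_a > threshold
    b = stack_b > threshold
    # A voxel only counts if the signal is also present in the adjacent focal planes,
    # which rejects bright spots confined to a single plane (likely cross-over or noise).
    persistent_a = a[1:-1] & a[:-2] & a[2:]
    persistent_b = b[1:-1] & b[:-2] & b[2:]
    overlap = persistent_a & persistent_b
    denom = (persistent_a | persistent_b).sum()
    return overlap.sum() / denom if denom else 0.0

rng = np.random.default_rng(1)
red = rng.random((9, 64, 64))
green = 0.5 * red + 0.5 * rng.random((9, 64, 64))   # partially co-localised channels
print(binary_colocalisation(red, green))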

Relevance:

40.00%

Publisher:

Abstract:

The range of potential applications for indoor and campus-based personnel localisation has led researchers to create a wide spectrum of different algorithmic approaches and systems. However, the majority of the proposed systems overlook the unique radio environment presented by the human body, leading to systematic errors and inaccuracies when deployed in this context. In this paper, RSSI-based Monte Carlo Localisation was implemented using commercial off-the-shelf 868 MHz hardware, and empirical data was gathered across a relatively large number of scenarios within a single indoor office environment. This data showed that body shadowing caused by the human body introduced path skew into the location estimates. It was also shown that, by using two body-worn nodes in concert, the effect of body shadowing can be mitigated by averaging the estimated positions of the two nodes worn on either side of the body. © Springer Science+Business Media, LLC 2012.
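A minimal sketch of the mitigation described above: converting RSSI readings to rough positions and averaging the estimates from two body-worn nodes. The path-loss parameters, anchor positions and the crude weighted-centroid estimator are illustrative stand-ins, not the paper's Monte Carlo Localisation:

import numpy as np

def rssi_to_distance(rssi_dbm, tx_power_dbm=-45.0, path_loss_exponent=2.5):
    # Log-distance path-loss model: RSSI = P_1m - 10 * n * log10(d)  (assumed parameters)
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def estimate_position(anchors, rssi):
    # Crude weighted centroid from RSSI-derived distances, standing in for the
    # particle-filter (Monte Carlo) estimator used in the paper.
    d = np.array([rssi_to_distance(r) for r in rssi])
    w = 1.0 / np.maximum(d, 0.1)
    return (anchors * w[:, None]).sum(axis=0) / w.sum()

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])  # fixed 868 MHz nodes
rssi_left  = [-60.0, -72.0, -65.0, -78.0]   # readings at the node on the left side of the body
rssi_right = [-70.0, -63.0, -76.0, -66.0]   # readings at the node on the right side of the body

p_left  = estimate_position(anchors, rssi_left)
p_right = estimate_position(anchors, rssi_right)
print("combined estimate:", (p_left + p_right) / 2)   # averaging the two nodes mitigates body shadowing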