4 results for data hiding

in Deakin Research Online - Australia


Relevance:

100.00%

Publisher:

Abstract:

We present a novel framework of performing multimedia data hiding using an over-complete dictionary, which brings compressive sensing to the application of data hiding. Unlike the conventional orthonormal full-space dictionary, the over-complete dictionary produces an underdetermined system with infinitely many transform results. We first discuss the minimum norm formulation (ℓ2-norm), which yields a closed-form solution, and the concept of watermark projection, so that higher embedding capacity and an additional privacy-preserving feature can be obtained. Furthermore, we study the sparse formulation (ℓ0-norm) and illustrate that as long as the ℓ0-norm of the sparse representation of the host signal is less than the signal's dimension in the original domain, an informed sparse-domain data hiding system can be established by modifying the coefficients of the atoms that have not participated in representing the host signal. A single support modification-based data hiding system is then proposed and analyzed as an example. Several potential research directions are discussed for further studies. More generally, apart from the ℓ2 and ℓ0-norm constraints, other conditions for reliable detection performance warrant future investigation.
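The closed-form minimum-norm solution mentioned in the abstract can be sketched in a few lines of NumPy. The dictionary sizes and signals below are illustrative assumptions, not values from the paper; the sketch only demonstrates why an over-complete dictionary yields infinitely many transform results and why the ℓ2 formulation picks the smallest of them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: an over-complete dictionary has more atoms (columns)
# than the signal dimension, so y = D x is underdetermined and has
# infinitely many transform results x.
n, m = 8, 16
D = rng.standard_normal((n, m))   # over-complete dictionary
y = rng.standard_normal(n)        # host signal

# Closed-form minimum-norm (l2) solution: x = D^T (D D^T)^{-1} y
x_min = D.T @ np.linalg.solve(D @ D.T, y)

# Any other solution differs from x_min by a null-space component of D,
# which can only increase the l2 norm.
_, _, Vt = np.linalg.svd(D)       # rows n..m-1 of Vt span null(D)
null_component = Vt[n:].T @ rng.standard_normal(m - n)
x_other = x_min + null_component
```

Both `x_min` and `x_other` reconstruct the host signal exactly, but `x_min` has the smaller norm; the extra degrees of freedom in the null space are what the framework exploits for embedding.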

Relevance:

60.00%

Publisher:

Abstract:

Surveillance applications in private environments such as smart houses require a privacy management policy if such systems are to be accepted by the occupants of the environment. This is due to the invasive nature of surveillance, and the private nature of the home. In this article, we propose a framework for dynamically altering the privacy policy applied to the monitoring of a smart house based on the situation within the environment. Initially the situation, or context, within the environment is determined; we identify several factors for determining environmental context, and propose methods to quantify the context using audio and binary sensor data. The context is then mapped to an appropriate privacy policy, which is implemented by applying data hiding techniques to control access to data gathered from various information sources. The significance of this work lies in the examination of privacy issues related to assisted-living smart house environments. A single privacy policy in such applications would be either too restrictive for an observer, for example, a carer, or too invasive for the occupants. We address this by proposing a dynamic method, with the aim of decreasing the invasiveness of the technology, while retaining the purpose of the system.
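The context-to-policy pipeline described above can be sketched as follows. Every sensor field, threshold, context label, and policy level here is a hypothetical illustration for the sake of the example, not the article's actual quantification method or policy.

```python
# Hypothetical sketch: derive a coarse context label from audio and binary
# sensor data, then map it to a privacy policy that controls which
# information sources an observer (e.g. a carer) may access.

def quantify_context(audio_level: float, door_open: bool, motion: bool) -> str:
    """Quantify the situation in the environment from simple sensor readings.
    Thresholds and labels are illustrative assumptions."""
    if audio_level > 0.8 and not motion:
        return "possible_emergency"   # loud noise but no movement detected
    if door_open:
        return "visitor_present"
    return "routine_activity"

# Context -> privacy policy: what each data source reveals to the observer.
POLICY = {
    "possible_emergency": {"video": "raw",        "audio": "raw"},
    "visitor_present":    {"video": "silhouette", "audio": "muted"},
    "routine_activity":   {"video": "blocked",    "audio": "muted"},
}

def policy_for(audio_level: float, door_open: bool, motion: bool) -> dict:
    """Dynamically select the policy applied to the monitoring output."""
    return POLICY[quantify_context(audio_level, door_open, motion)]
```

The key design point is that the policy is a function of the current context rather than a fixed setting, so access widens in an emergency and narrows during routine private activity.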

Relevance:

60.00%

Publisher:

Abstract:

A smart house can be regarded as a surveillance environment in which the person being observed carries out activities that range from intimate to more public. What can be observed depends on the activity, the person observing (e.g. a carer) and policy. In assisted living smart house environments, a single privacy policy, applied throughout, would be either too invasive for an occupant, or too restrictive for an observer, due to the conflicting goals of surveillance and private environments. Hence, we propose a dynamic method for altering the level of privacy in the environment based on the context (the situation within the environment), encompassing factors relevant to ensuring the occupant's safety and privacy. The context is mapped to an appropriate level of privacy, which is implemented by controlling access to data sources (e.g. video) using data hiding techniques. The aim of this work is to decrease the invasiveness of the technology, while retaining the purpose of the system.
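Controlling access to a video source at a given privacy level could look like the sketch below. Pixelation stands in here for the data hiding techniques the abstract refers to, and the level names are invented for illustration; the actual methods in the paper may differ.

```python
import numpy as np

def pixelate(frame: np.ndarray, block: int = 8) -> np.ndarray:
    """Obscure a grayscale frame by replacing each block x block tile with
    its mean value, a simple stand-in for data hiding on a video source."""
    h, w = frame.shape
    out = frame.astype(float).copy()
    for i in range(0, h, block):
        for j in range(0, w, block):
            tile = out[i:i + block, j:j + block]
            tile[:] = tile.mean()
    return out

def render_for_observer(frame: np.ndarray, privacy_level: str):
    """Apply the current privacy level (hypothetical names) to a frame:
    'raw' passes through, 'obscured' pixelates, 'blocked' hides it."""
    if privacy_level == "raw":
        return frame
    if privacy_level == "obscured":
        return pixelate(frame)
    return None
```

Because the level is chosen per frame, the same observer can see raw video in one context and an obscured or blocked view in another.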

Relevance:

40.00%

Publisher:

Abstract:

Privacy preservation in data mining and data release has attracted increasing research interest over a number of decades. Differential privacy is one influential privacy notion that offers a rigorous and provable privacy guarantee for data mining and data release. Existing studies on differential privacy assume that records in a data set are sampled independently. However, in real-world applications, records in a data set are rarely independent. The relationships among records are referred to as correlated information, and such a data set is defined as a correlated data set. A differential privacy technique performed on a correlated data set will disclose more information than expected, and this is a serious privacy violation. Although recent research has been concerned with this new privacy violation, it still calls for a solid solution for the correlated data set. Moreover, how to decrease the large amount of noise incurred via differential privacy on correlated data sets is yet to be explored. To fill the gap, this paper proposes an effective correlated differential privacy solution by defining the correlated sensitivity and designing a correlated data releasing mechanism. By taking the correlated levels between records into consideration, the proposed correlated sensitivity can significantly decrease the noise compared with traditional global sensitivity. The correlated data releasing mechanism, the correlated iteration mechanism, is designed based on an iterative method to answer a large number of queries. Compared with the traditional method, the proposed correlated differential privacy solution enhances the privacy guarantee for a correlated data set with less accuracy cost. Experimental results show that the proposed solution outperforms traditional differential privacy in terms of mean square error on a large group of queries. This also suggests that correlated differential privacy can successfully retain utility while preserving privacy.
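The noise reduction from a correlated sensitivity can be illustrated with a toy Laplace mechanism. The correlation matrix, its simplified sensitivity definition, and all numeric values below are assumptions made for this sketch; the paper derives correlated levels from the data and defines its mechanism more carefully.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy correlation matrix delta[i][j] in [0, 1]: how strongly record j is
# affected when record i changes (illustrative values, not from the paper).
delta = np.array([
    [1.0, 0.6, 0.0],
    [0.6, 1.0, 0.3],
    [0.0, 0.3, 1.0],
])

record_contrib = 1.0  # max contribution of one record to the query answer

# Global sensitivity must assume changing one record can move every
# correlated record fully, so it scales with the number of records.
global_sensitivity = delta.shape[0] * record_contrib

# Simplified correlated sensitivity: worst case over records i of the total
# correlated effect sum_j delta[i][j]; smaller when correlation is weak.
correlated_sensitivity = max(delta[i].sum() * record_contrib
                             for i in range(delta.shape[0]))

# Laplace mechanism: noise scale is sensitivity / epsilon, so a smaller
# correlated sensitivity directly means less noise at the same epsilon.
epsilon = 1.0
true_answer = 10.0
noisy_answer = true_answer + rng.laplace(scale=correlated_sensitivity / epsilon)
```

With these illustrative values the correlated sensitivity is 1.9 versus a global sensitivity of 3.0, so the released answer carries proportionally less Laplace noise for the same privacy budget.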