971 results for Sensitive Data


Relevance:

70.00%

Publisher:

Abstract:

A substantial reform of data protection law is on the agenda of the European Commission, as it is widely agreed that data protection law faces many challenges due to fundamental technical and social changes, or even revolutions. The authors have therefore drafted new data protection provisions that would work in both Germany and Europe. The draft is intended to provide a new approach and to deal with the consequences of such an approach. This article contains some key theses on the main legislative changes that appear both necessary and adequate.

Relevance:

70.00%

Publisher:

Abstract:

Data leakage is a serious issue and can result in the loss of sensitive data, compromising user accounts and details and potentially affecting millions of internet users. This paper contributes to research in online security and reducing personal footprint by evaluating the levels of privacy provided by the Firefox browser. The aim of identifying conditions that would minimize data leakage and maximize data privacy is addressed by assessing and comparing data leakage in the four possible browsing modes: normal and private modes, using either a browser installed on the host PC or a portable browser run from a connected USB device. To provide a firm foundation for analysis, a series of carefully designed, pre-planned browsing sessions was repeated in each of the various modes of Firefox. This included low-RAM environments, to determine any effects low RAM may have on browser data leakage. The results show that considerable data leakage may occur within Firefox. In normal mode, all of the browsing information is stored within the Mozilla profile folder in Firefox-specific SQLite databases and sessionstore.js. While passwords were not stored as plain text, other confidential information such as credit card numbers could be recovered from the form history under certain conditions. There is no difference when using a portable browser in normal mode, except that the Mozilla profile folder is located on the USB device rather than the host's hard disk. By comparison, private browsing reduces data leakage. Our findings confirm that no information is written to the Firefox-related locations on the hard disk or USB device during private browsing, implying that no deletion would be necessary and no remnants of data would be forensically recoverable from unallocated space. However, two aspects of data leakage occurred equally in all four browsing modes. Firstly, all of the browsing history was stored in live RAM and was therefore accessible while the browser remained open. Secondly, in low-RAM situations, the operating system caches out RAM to pagefile.sys on the host's hard disk. Irrespective of the browsing mode used, this may include Firefox history elements, which can then remain forensically recoverable for a considerable time.
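As an illustration of the kind of profile-folder inspection described above, the following sketch queries the SQLite databases in a Firefox profile. The file names places.sqlite and formhistory.sqlite and their table layouts reflect common Firefox versions and are assumptions here, not findings of the paper; the profile path is hypothetical.

```python
import sqlite3
from pathlib import Path

# Hypothetical profile location; real paths vary by OS, user and Firefox version.
profile = Path("/home/user/.mozilla/firefox/abcd1234.default-release")

# Browsing history: moz_places in places.sqlite (schema assumed from common Firefox builds).
with sqlite3.connect(profile / "places.sqlite") as db:
    for url, title, last_visit in db.execute(
        "SELECT url, title, last_visit_date FROM moz_places "
        "WHERE last_visit_date IS NOT NULL ORDER BY last_visit_date DESC LIMIT 10"
    ):
        # last_visit_date is stored as microseconds since the Unix epoch.
        print(last_visit, url, title)

# Form history: moz_formhistory in formhistory.sqlite may retain typed values
# (e.g. search terms or, under certain conditions, numbers typed into forms).
with sqlite3.connect(profile / "formhistory.sqlite") as db:
    for field, value in db.execute("SELECT fieldname, value FROM moz_formhistory"):
        print(field, value)
```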

Relevance:

70.00%

Publisher:

Abstract:

In recent years, there has been exponential growth in the use of virtual spaces, including dialogue systems, that handle personal information. The concept of personal privacy is much discussed and controversial in the literature, whereas in the technological field it directly influences the degree of reliability perceived in the information system (privacy 'as trust'). This work aims to protect the right to privacy over personal data (GDPR, 2018) and to avoid the loss of sensitive content by exploring the sensitive information detection (SID) task. It is grounded on the following research questions: (RQ1) What does sensitive data mean? How can a personal sensitive information domain be defined? (RQ2) How can a state-of-the-art model for SID be created? (RQ3) How can the model be evaluated? RQ1 theoretically investigates the concepts of privacy and the ontological state-of-the-art representation of personal information. The Data Privacy Vocabulary (DPV) is the taxonomic resource taken as an authoritative reference for the definition of the knowledge domain. Concerning RQ2, we investigate two approaches to classify sensitive data: the first, bottom-up, explores automatic learning methods based on transformer networks; the second, top-down, proposes logical-symbolic methods with the construction of privaframe, a knowledge graph of compositional frames representing personal data categories. Both approaches are tested. For the evaluation (RQ3), we create SPeDaC, a sentence-level labeled resource. This can be used as a benchmark or for training in the SID task, filling the gap of a shared resource in this field. While the approach based on artificial neural networks confirms the validity of the direction adopted in the most recent studies on SID, the logical-symbolic approach emerges as the preferred way to classify fine-grained personal data categories, thanks to the semantically grounded, tailored modeling it allows. At the same time, the results highlight the strong potential of hybrid architectures in solving automatic tasks.
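A minimal sketch of the bottom-up direction described above: sentence-level classification with a pretrained transformer. The checkpoint name and label set are hypothetical placeholders; the thesis's own models and the SPeDaC labels are not reproduced here.

```python
# Requires: pip install transformers torch
from transformers import pipeline

# Hypothetical checkpoint name; substitute a model actually fine-tuned for
# sensitive-information detection (e.g. one trained on SPeDaC-style labels).
classifier = pipeline("text-classification", model="my-org/sid-sentence-classifier")

sentences = [
    "My card number is 4111 1111 1111 1111.",
    "The meeting starts at 10 am.",
]
for s in sentences:
    result = classifier(s)[0]  # e.g. {'label': 'SENSITIVE', 'score': 0.97}
    print(f"{result['label']:>12} {result['score']:.2f}  {s}")
```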

Relevance:

60.00%

Publisher:

Abstract:

There is no doubt about the necessity of protecting digital communication: citizens entrust their most confidential and sensitive data to digital processing and communication, and so do governments, corporations, and armed forces. Digital communication networks are also an integral component of many critical infrastructures on which we seriously depend in our daily lives. Transportation services, financial services, energy grids, and food production and distribution networks are only a few examples of such infrastructures. Protecting digital communication means protecting confidentiality and integrity by encrypting and authenticating its contents. But most digital communication is not secure today. Nevertheless, some of the most pressing problems could be solved with a more stringent use of current cryptographic technologies. Quite surprisingly, a new cryptographic primitive emerges from the application of quantum mechanics to information and communication theory: Quantum Key Distribution. QKD is difficult to understand; it is complex, technically challenging, and costly. Yet it enables two parties to share a secret key for use in any subsequent cryptographic task, with unprecedented long-term security. It is disputed whether technically and economically feasible applications can be found. Our vision is that, despite technical difficulty and inherent limitations, Quantum Key Distribution has great potential and fits well with other cryptographic primitives, enabling the development of highly secure new applications and services. In this thesis we take a structured approach to analyzing the practical applicability of QKD and present several use cases of different complexity for which it can be a technology of choice, either because of its unique forward security features or because of its practicability.
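A toy sketch of the key-sifting step that underlies BB84-style QKD, included only to make the "shared secret key" idea concrete. Real QKD requires a quantum channel, error estimation and privacy amplification, none of which are modelled here; the channel below is an idealised, noiseless assumption.

```python
import secrets

n = 32  # raw key length (toy value)

# Alice picks random bits and random bases; Bob picks random measurement bases.
alice_bits  = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0 = rectilinear, 1 = diagonal
bob_bases   = [secrets.randbelow(2) for _ in range(n)]

# Idealised channel: when bases match, Bob reads Alice's bit; otherwise his
# outcome is random (no eavesdropper, no noise in this sketch).
bob_bits = [b if ab == bb else secrets.randbelow(2)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: keep only positions where the bases agree (bases are announced publicly).
sifted = [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
          if ab == bb]
key = [a for a, _ in sifted]
print(f"kept {len(key)} of {n} bits; keys agree: {all(a == b for a, b in sifted)}")
```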

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: PCR has the potential to detect and precisely quantify specific DNA sequences, but it is not yet often used as a fully quantitative method. A number of data collection and processing strategies have been described for the implementation of quantitative PCR. However, they can be experimentally cumbersome, their relative performances have not been evaluated systematically, and they often remain poorly validated statistically and/or experimentally. In this study, we evaluated the performance of known methods and compared them with newly developed data processing strategies in terms of resolution, precision and robustness. RESULTS: Our results indicate that simple methods that do not rely on the estimation of the efficiency of the PCR amplification may provide reproducible and sensitive data, but that they do not quantify DNA with precision. Other evaluated methods, based on sigmoidal or exponential curve fitting, were generally of both poor resolution and poor precision. A statistical analysis of the parameters that influence efficiency indicated that it depends mostly on the selected amplicon and, to a lesser extent, on the particular biological sample analyzed. Thus, we devised various strategies based on individual or averaged efficiency values, which were used to assess the regulated expression of several genes in response to a growth factor. CONCLUSION: Overall, qPCR data analysis methods differ significantly in their performance, and this analysis identifies methods that provide DNA quantification estimates of high precision, robustness and reliability. These methods allow reliable estimation of relative expression ratios of two-fold or higher, and our analysis provides an estimate of the number of biological samples that have to be analyzed to achieve a given precision.
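As an illustration of an efficiency-corrected analysis of the kind evaluated above, the following sketch computes a relative expression ratio in the style of the well-known Pfaffl method. The efficiency and Ct values are made-up example numbers, not data from the study.

```python
def relative_expression_ratio(e_target, e_ref, ct_target_ctrl, ct_target_sample,
                              ct_ref_ctrl, ct_ref_sample):
    """Efficiency-corrected ratio (Pfaffl-style):
    ratio = E_target^(Ct_ctrl - Ct_sample) / E_ref^(Ct_ctrl - Ct_sample),
    with E given as fold-amplification per cycle (2.0 = perfect doubling)."""
    return (e_target ** (ct_target_ctrl - ct_target_sample)) / \
           (e_ref ** (ct_ref_ctrl - ct_ref_sample))

# Made-up example: target gene induced by a growth factor, reference gene stable.
ratio = relative_expression_ratio(e_target=1.95, e_ref=2.00,
                                  ct_target_ctrl=24.8, ct_target_sample=22.9,
                                  ct_ref_ctrl=18.1, ct_ref_sample=18.0)
print(f"relative expression ratio: {ratio:.2f}")  # roughly 3.3-fold induction here
```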

Relevance:

60.00%

Publisher:

Abstract:

The selective forwarding attack is a serious threat in wireless sensor networks (WSNs), particularly in monitoring systems. Nodes can maliciously drop certain sensitive data packets, which risks destroying the value of the data assembled in the network and reducing the availability of the sensors' services. After reviewing the drawbacks of existing schemes, we present a lightweight security scheme based on sending fake reports to identify selective forwarding attacks. The great advantage of our approach is that the base station expects a sequence of fake packets at a precise time without having communicated with the network nodes. It is therefore able to detect packet loss. Theoretical analysis shows that the proposed scheme can identify this type of attack and can thus improve the robustness of the network while maintaining a good trade-off between security reliability and transmission cost. Our scheme can achieve a high identification success rate in the presence of a large number of malicious nodes, while the transmission cost can be kept within reasonable limits.
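A highly simplified sketch of the detection idea described above: the base station knows which fake-report identifiers to expect in a given time window, so missing identifiers signal that packets were dropped along the path. The window, identifiers and threshold are invented for illustration and are not taken from the scheme itself.

```python
def detect_selective_forwarding(expected_fake_ids, received_ids, loss_threshold=0.2):
    """Flag a possible selective-forwarding attack when too many of the fake
    reports expected in this time window never reach the base station."""
    expected = set(expected_fake_ids)
    missing = expected - set(received_ids)
    loss_rate = len(missing) / len(expected) if expected else 0.0
    return loss_rate > loss_threshold, sorted(missing)

# Invented example: 10 fake reports scheduled for this window, 3 never arrive.
expected = range(100, 110)
received = [100, 101, 103, 104, 105, 107, 108]
suspicious, missing = detect_selective_forwarding(expected, received)
print(f"suspicious={suspicious}, missing fake reports={missing}")
```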

Relevance:

60.00%

Publisher:

Abstract:

Italy was the second-to-last European country, followed only by Greece, to adopt a privacy protection law (law of 31 December 1996). Paradoxically, it is in Italy that some of the best works on this subject were written, notably those of Professor Rodotà. Despite the Italian legislature's delay, it should be noted that the 1996 law, which follows the European Community Directive on the protection of personal data, introduces a modern concept of privacy that is not limited simply to a "right to be let alone", in the famous late-nineteenth-century conception, but refers instead to the protection of the human person. The concept of privacy, understood as a prohibition on accessing personal information, is transformed into control over information relating to the person. In this way, an idea of privacy develops whose foundations are the rights to control, correct and delete information about the person. In this respect, it is important to highlight the dual authorization system for the lawful processing of information. The data subject's consent is required for personal data. For so-called "sensitive" data, on the other hand, the authorization of the Garante is required in addition to the data subject's consent. By contrast, no authorization is required for the processing of data with an exclusively personal purpose, or for so-called "anonymous" data, provided they do not allow the data subject to be identified. The type of civil liability provided for by the 1996 law is particularly interesting: article 18 provides for the application of article 2050 of the Italian Civil Code (performance of dangerous activities), while article 29 provides for the award of damages for non-pecuniary harm (this provision is mandatory, in accordance with article 2059 of the Italian Civil Code). This article examines the application of the rules discussed above to the Internet.

Relevance:

60.00%

Publisher:

Abstract:

In recent years, protection of information in digital form has become more important. Image and video encryption has applications in various fields, including Internet communications, multimedia systems, medical imaging, telemedicine and military communications. During storage as well as in transmission, multimedia information is exposed to unauthorized entities unless adequate security measures are built around the information system. There are many kinds of security threats during the transmission of vital classified information through insecure communication channels. Various encryption schemes are available today to deal with information security issues. Data encryption is widely used to protect sensitive data against the security threat in the form of an "attack on confidentiality". Secure transmission of information through insecure communication channels also requires encryption at the sending side and decryption at the receiving side. Encryption of large text messages and images takes time before they can be transmitted, causing considerable delay in the successive transmission of information in real time. In order to minimize this latency, efficient encryption algorithms are needed. An encryption procedure with adequate security and high throughput is sought in multimedia encryption applications. Traditional symmetric-key block ciphers such as the Data Encryption Standard (DES), the Advanced Encryption Standard (AES) and the Escrowed Encryption Standard (EES) are not efficient when the data size is large. With the availability of fast computing tools and communication networks at relatively low cost today, these encryption standards appear not to be as fast as one would like. High-throughput encryption and decryption are becoming increasingly important in the area of high-speed networking, and fast encryption algorithms are needed for high-speed secure communication of multimedia data. It has been shown that public-key algorithms are not a substitute for symmetric-key algorithms: public-key algorithms are slow, whereas symmetric-key algorithms generally run much faster. Also, public-key systems are vulnerable to chosen-plaintext attack. In this research work, a fast symmetric-key encryption scheme, entitled "Matrix Array Symmetric Key (MASK) encryption" and based on matrix and array manipulations, has been conceived and developed. Fast operation is achieved with the matrix table look-up substitution, array-based transposition and circular shift operations performed in the algorithm. MASK encryption is a new concept in symmetric-key cryptography. It employs a matrix and array manipulation technique using secret information and data values. It is a block cipher operating on plaintext message (or image) blocks of 128 bits, using a secret key of 128 bits and producing ciphertext message (or cipher image) blocks of the same size. This cipher has two advantages over traditional ciphers. First, the encryption and decryption procedures are much simpler and, consequently, much faster. Second, the key avalanche effect produced in the ciphertext output is better than that of AES.
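To make the operations named above concrete (table look-up substitution, array-based transposition and circular shifts), the following is a deliberately insecure toy round on a 16-byte block. It illustrates those building blocks only; it is not the MASK algorithm, whose specification is not given in the abstract.

```python
# Toy illustration only -- NOT the MASK cipher and NOT cryptographically secure.
SBOX = bytes((i * 7 + 3) % 256 for i in range(256))  # invented substitution table

def toy_round(block: bytes, key: bytes) -> bytes:
    assert len(block) == 16 and len(key) == 16
    # 1. Table look-up substitution, mixed with the key.
    state = [SBOX[b ^ k] for b, k in zip(block, key)]
    # 2. Array-based transposition: treat the state as a 4x4 matrix and transpose it.
    matrix = [state[r * 4:(r + 1) * 4] for r in range(4)]
    state = [matrix[c][r] for r in range(4) for c in range(4)]
    # 3. Key-dependent circular shift of the whole 128-bit block.
    shift = key[0] % 16
    state = state[shift:] + state[:shift]
    return bytes(state)

print(toy_round(b"0123456789abcdef", b"secret-key-16byt").hex())
```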

Relevance:

60.00%

Publisher:

Abstract:

The ever-increasing spurt of digital crimes such as image manipulation, image tampering, signature forgery, image forgery and illegal transactions has intensified the demand to combat these forms of criminal activity. In this direction, biometrics, the computer-based validation of a person's identity, is becoming more and more essential, particularly for high-security systems. The essence of biometrics is the measurement of a person's physiological or behavioral characteristics, which enables authentication of that person's identity. Biometric-based authentication is also becoming increasingly important in computer-based applications because the amount of sensitive data stored in such systems is growing. The new demands on biometric systems are robustness, high recognition rates, the capability to handle imprecision and uncertainties of a non-statistical kind, and great flexibility. It is exactly here that soft computing techniques come into play. The main aim of this write-up is to present a pragmatic view of applications of soft computing techniques in biometrics and to analyze their impact. It is found that soft computing has already made inroads, in terms of individual methods or in combination. Applications of varieties of neural networks top the list, followed by fuzzy logic and evolutionary algorithms. In a nutshell, the soft computing paradigms are used for biometric tasks such as feature extraction, dimensionality reduction, pattern identification, pattern mapping and the like.

Relevance:

60.00%

Publisher:

Abstract:

Attacks on devices connected to networks are one of the main problems related to the confidentiality of sensitive data and the correct functioning of computer systems. In spite of the availability of tools and procedures that harden systems or prevent the occurrence of security incidents, network devices are successfully attacked using strategies applied in previous events. The lack of knowledge about the scenarios in which these attacks occurred contributes effectively to the success of new attacks. The development of a tool that makes this kind of information available is therefore of great relevance. This work presents a support system for the management of corporate security, covering the storage and retrieval of attack scenarios and related information and helping to construct them. If an incident occurs in a corporation, an expert must access the system to store the specific attack scenario. This scenario, made available through controlled access, must be analyzed so that effective decisions or actions can be taken in similar cases. Besides the strategy used by the attacker, attack scenarios also document the vulnerabilities exploited in devices. Access to this kind of information contributes to an increased security level for a corporation's network devices and a decreased response time to occurring incidents.
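A minimal sketch of the kind of record such a support system could store for each attack scenario. The field names and example values are hypothetical and are not the schema actually used in the work.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime

@dataclass
class AttackScenario:
    """Hypothetical record stored by the security-management support system."""
    title: str
    strategy: str                         # how the attacker proceeded
    exploited_vulnerabilities: list[str]
    affected_devices: list[str]
    countermeasures: list[str]
    reported_by: str                      # the expert who registered the incident
    reported_at: datetime = field(default_factory=datetime.utcnow)

scenario = AttackScenario(
    title="SSH brute force on edge router",
    strategy="Dictionary attack against an exposed management interface",
    exploited_vulnerabilities=["default credentials", "no login rate limiting"],
    affected_devices=["router-edge-01"],
    countermeasures=["disable password login", "restrict management VLAN"],
    reported_by="security-team",
)
print(asdict(scenario))
```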

Relevance:

60.00%

Publisher:

Abstract:

The main topic of the thesis is the conflict between disclosure in financial markets and the firm's need for confidentiality. After a review of the major dynamics of information production and dissemination in the stock market, the analysis moves to the interactions between the information that a firm is typically interested in keeping confidential, such as trade secrets or the data usually covered by patent protection, and the countervailing demand for disclosure arising from financial markets. The analysis demonstrates that, despite the seeming divergence between the informational content typically disclosed to investors and the information usually covered by intellectual property protection, the overlapping areas are nonetheless wide, and the conflict between transparency in financial markets and the firm's need for confidentiality arises frequently and systematically. Indeed, the company's disclosure policy is based on a continuous trade-off between the costs and the benefits related to the public dissemination of information. Such costs are mainly represented by the competitive harm caused by competitors' access to sensitive data, while the benefits mainly refer to the lower cost of capital that the firm obtains as a consequence of more disclosure. Secrecy shields the value of costly produced information against third parties' free riding and therefore constitutes a means to protect the firm's incentives toward the production of new information, and especially toward technological and business innovation. Excessively demanding standards of transparency in financial markets might hinder this set of incentives and thus jeopardize the dynamics of innovation production. Within Italian securities regulation, two sets of rules are most relevant to this issue: the first is the rule that requires issuers to promptly disclose all price-sensitive information to the market on an ongoing basis; the second is the duty to disclose in the prospectus all the information "necessary to enable investors to make an informed assessment" of the issuer's financial and economic perspectives. Both rules impose high disclosure standards and have potentially unlimited scope, yet they include safe harbours aimed at protecting the issuer's need for confidentiality. Despite the structural incompatibility between the public dissemination of information and the firm's need to keep certain data confidential, there are ways to convey information to the market while preserving the firm's need for confidentiality. Such means are insider trading and selective disclosure: both are based on mechanics whereby the process of price reaction to the new information takes place without any corresponding public release of data. Therefore, they offer a solution to the conflict between disclosure and the need for confidentiality that enhances market efficiency while preserving the private set of incentives toward innovation.

Relevance:

60.00%

Publisher:

Abstract:

This work first presents a study of the national and international laws in the fields of safety, security and safeguards. The international treaties and the recommendations issued by the IAEA, as well as the national regulations in force in France, the United States and Italy, are analyzed, and a comparison among them is presented. Given the interest of the Japan Atomic Energy Agency in the aspects of criminal and monetary penalties, the Japanese case is also analyzed. The main part of this work was carried out at the JAEA in the field of proliferation resistance (PR) and physical protection (PP) of a GEN IV sodium fast reactor. For this purpose the design of the system is completed and the PR&PP methodology is applied to obtain data usable by designers for the improvement of the system itself. Due to the presence of sensitive data, not all the details can be disclosed. The reactor site of a hypothetical commercial sodium-cooled fast neutron nuclear reactor system (SFR) is used as the target NES for the application of the methodology. The methodology is applied to all the PR and PP scenarios: diversion, misuse and breakout; theft and sabotage. The methodology is applied to the SFR first to check whether this system meets the PR and PP targets described in the GIF goal; secondly, a comparison between the SFR and an LWR is performed to evaluate whether and how it would be possible to improve the PR&PP of the SFR. The comparison is implemented according to the example development target: achieving PR&PP similar or superior to that of domestic and international ALWRs. Three main actions were performed: implementing the evaluation methodology; characterizing the PR&PP of the nuclear energy system; and identifying recommendations for system designers through the comparison.

Relevance:

60.00%

Publisher:

Abstract:

High-energy gamma rays can provide fundamental clues to the origins of cosmic rays. In this thesis, TeV gamma-ray emission from the Cygnus region is studied. Previously, the Milagro experiment detected five TeV gamma-ray sources in this region and a significant excess of TeV gamma rays whose origin is still unclear. To better understand the diffuse excess, the separation of sources and diffuse emission is studied using the latest and most sensitive data set of the Milagro experiment. In addition, a newly developed technique is applied that allows the energy spectrum of the TeV gamma rays to be reconstructed using Milagro data. No conclusive statement can be made about the spectrum of the diffuse emission from the Cygnus region because of its low significance of 2.2 σ above the background in the studied data sample. The emission from the entire Cygnus region is best fit with a power law with a spectral index of α = 2.40 (68% confidence interval: 1.35-2.92) and an exponential cutoff energy of 31.6 TeV (10.0-251.2 TeV). Assuming a simple power law without a cutoff energy, the best fit yields a spectral index of α = 2.97 (68% confidence interval: 2.83-3.10). Neither of these best fits is in good agreement with the data. The best spectral fit to the TeV emission from MGRO J2019+37, the brightest source in the Cygnus region, yields a spectral index of α = 2.30 (68% confidence interval: 1.40-2.70) with a cutoff energy of 50.1 TeV (68% confidence interval: 17.8-251.2 TeV), and a spectral index of α = 2.75 (68% confidence interval: 2.65-2.85) when no exponential cutoff energy is assumed. According to the present analysis, MGRO J2019+37 contributes 25% of the differential flux from the entire Cygnus region at 15 TeV.
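For reference, the two spectral shapes fitted above have the standard forms dN/dE = N0 (E/E0)^(-α) and dN/dE = N0 (E/E0)^(-α) exp(-E/Ecut). The sketch below evaluates them with the best-fit indices quoted in the abstract; the normalization N0 and the pivot energy E0 are placeholders, since the abstract does not quote them.

```python
import math

def power_law(energy_tev, n0, index, pivot_tev=1.0):
    """Simple power law: dN/dE = N0 * (E/E0)^(-index)."""
    return n0 * (energy_tev / pivot_tev) ** (-index)

def cutoff_power_law(energy_tev, n0, index, cutoff_tev, pivot_tev=1.0):
    """Power law with exponential cutoff: dN/dE = N0 * (E/E0)^(-index) * exp(-E/Ecut)."""
    return power_law(energy_tev, n0, index, pivot_tev) * math.exp(-energy_tev / cutoff_tev)

# Best-fit values quoted in the abstract for the whole Cygnus region emission;
# N0 and the pivot energy are placeholders (not given in the abstract).
n0 = 1.0
for e in (1.0, 15.0, 30.0):
    print(e, "TeV:",
          cutoff_power_law(e, n0, index=2.40, cutoff_tev=31.6),
          power_law(e, n0, index=2.97))
```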

Relevance:

60.00%

Publisher:

Abstract:

With today's prevalence of Internet-connected systems storing sensitive data and the omnipresent threat of technically skilled malicious users, computer security remains a critically important field. Because of today's multitude of vulnerable systems and security threats, it is vital that computer science students be taught techniques for programming secure systems, especially since many of them will work on systems with sensitive data after graduation. Teaching computer science students the proper design, implementation, and maintenance of secure systems is a challenging task that calls for the use of novel pedagogical tools. This report describes the implementation of a compiler that converts mandatory access control specifications written in the Domain-Type Enforcement Language to the Java Security Manager, primarily for pedagogical purposes. The implementation of the Java Security Manager was explored in depth, and various techniques to work around its inherent limitations were explored and partially implemented, although some of these workarounds do not appear in the current version of the compiler because they would have compromised cross-platform compatibility. The current version of the compiler and the implementation details of the Java Security Manager are discussed in depth.

Relevance:

60.00%

Publisher:

Abstract:

This article attempts to show how the introduction of the Aristotelian corpus into the medieval Christian world during the twelfth and thirteenth centuries contributed notably to vindicating the value of sense data in leading to intelligible knowledge. Indeed, the Platonism with which the first Christian thinkers were well acquainted denied that the sensible could give rise to true knowledge. At the same time, however, this meant that sensible things lacked sufficient ontological consistency. And since Christianity taught the dignity of all created things, Aristotelian philosophy came to provide it with a conception of the sensible far more consonant with its own principles. This confidence in concrete reality as an object of even intelligible knowledge nevertheless came to an end toward the close of the Middle Ages, and with it the epistemological realism characteristic of medieval Christian thought.