955 results for Sensitive information


Relevance:

60.00%

Abstract:

Today's wireless networks rely mostly on infrastructural support for their operation. With the concept of ubiquitous computing growing more popular, research on infrastructureless networks has been growing rapidly. However, such networks face serious security challenges when deployed. This dissertation focuses on designing a secure routing solution and trust modeling for these infrastructureless networks. The dissertation presents a trusted routing protocol that is capable of finding a secure end-to-end route in the presence of malicious nodes acting either independently or in collusion. The solution protects the network from active internal attacks, known to be the most severe type of attack in an ad hoc application. Route discovery is based on the trust levels of the nodes, which need to be computed dynamically to reflect the malicious behavior in the network. As such, we have developed a trust computational model in conjunction with the secure routing protocol that analyzes the different kinds of malicious behavior and quantifies them in the model itself. Our work is a first step towards protecting an ad hoc network from colluding internal attacks. To demonstrate the feasibility of the approach, extensive simulations have been carried out to evaluate the protocol's efficiency and its scalability with both network size and mobility. This research lays the foundation for developing a variety of techniques that will permit people to justifiably trust ad hoc networks to perform critical functions and to process sensitive information without depending on any infrastructural support, and hence will broaden the use of ad hoc applications in both military and civilian domains.
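
The abstract does not give the dissertation's actual trust-update equations, but the flavor of trust-based route discovery can be sketched as follows. The reward/penalty constants, the trust threshold, and the bottleneck-trust route metric are illustrative assumptions, not the dissertation's model.

```python
# Illustrative sketch of trust-based route selection. The update rule,
# constants, and bottleneck-trust metric are assumptions for exposition.

class TrustTable:
    def __init__(self, initial=0.5, reward=0.05, penalty=0.20):
        self.trust = {}            # node id -> trust level in [0, 1]
        self.initial = initial
        self.reward = reward       # increment for observed good behavior
        self.penalty = penalty     # decrement for observed misbehavior

    def get(self, node):
        return self.trust.get(node, self.initial)

    def observe(self, node, behaved_well):
        t = self.get(node)
        t = t + self.reward if behaved_well else t - self.penalty
        self.trust[node] = min(1.0, max(0.0, t))  # clamp to [0, 1]

def route_trust(route, table):
    # A route is only as trustworthy as its least trusted node.
    return min(table.get(n) for n in route)

def pick_route(candidate_routes, table, threshold=0.4):
    # Discard routes containing nodes below the trust threshold,
    # then prefer the route with the highest bottleneck trust.
    usable = [r for r in candidate_routes if route_trust(r, table) >= threshold]
    return max(usable, key=lambda r: route_trust(r, table)) if usable else None

table = TrustTable()
table.observe("B", behaved_well=False)   # B was seen dropping a packet
table.observe("C", behaved_well=True)
print(pick_route([["A", "B", "D"], ["A", "C", "D"]], table))  # ['A', 'C', 'D']
```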

Relevance:

60.00%

Abstract:

Reliability and sensitive-information protection are critical aspects of integrated circuits. A novel technique for tamper evidence of electronic components is presented, using near-field evanescent wave coupling between two subwavelength gratings (SWGs), with the input laser source delivered through an optical fiber. The first grating of the pair of coupled subwavelength gratings (CSWGs) was milled directly on the output facet of the silica fiber using focused ion beam (FIB) etching. The second grating was patterned using e-beam lithography and etched into a glass substrate using reactive ion etching (RIE). The slightest intrusion attempt would separate the CSWGs and eliminate near-field coupling between the gratings; tampering would therefore become evident. Computer simulations guided the design for optimal operation of the security solution. The physical dimensions of the SWGs, i.e., period and thickness, were optimized for a 650 nm illuminating wavelength. The optimal dimensions resulted in a 560 nm period for the first grating, etched in the silica optical fiber, and 420 nm for the second grating, etched in borosilicate glass. The incident light beam needed a half-width at half-maximum (HWHM) of at least 7 µm to allow discernible higher transmission orders, and a HWHM of 28 µm for minimum noise. The minimum number of individual grating lines present on the optical fiber facet was identified as 15. Grating rotation due to the cylindrical geometry of the fiber resulted in a rotation of the far-field pattern, corresponding to the rotation angle of the moiré fringes. With the goal of later adding authentication to tamper evidence, the concept of a CSWG signature was also modeled by introducing random and planned variations in the glass grating. The fiber was placed on a stage supported by a nanomanipulator, which permitted three-dimensional displacement while maintaining the fiber tip normal to the surface of the glass substrate. A 650 nm diode laser fixed to a translation mount transmitted light through the optical fiber, and the output intensity was measured using a silicon photodiode. The evanescent wave coupling output results for the CSWGs were measured and compared to the simulation results.
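
As a rough geometric illustration of the moiré effect the abstract mentions: two overlaid line gratings produce beat fringes whose spatial frequency is the vector difference of the two grating frequencies. The sketch below computes the fringe period and orientation for the 560 nm and 420 nm gratings at a small relative rotation; it is a geometry illustration only, not the paper's electromagnetic simulation, and the rotation angle is an arbitrary example value.

```python
import math

# Moire fringes of two overlaid line gratings: the fringe pattern has the
# spatial frequency given by the vector difference of the two grating
# frequency vectors. Periods are taken from the abstract (560 nm, 420 nm).

def moire_fringes(p1_nm, p2_nm, theta_deg):
    f1, f2 = 1.0 / p1_nm, 1.0 / p2_nm          # spatial frequencies (1/nm)
    theta = math.radians(theta_deg)
    # Grating 1 oriented along x; grating 2 rotated by theta.
    fx = f2 * math.cos(theta) - f1
    fy = f2 * math.sin(theta)
    f_m = math.hypot(fx, fy)                    # beat (moire) frequency
    period = 1.0 / f_m                          # fringe period in nm
    orientation = math.degrees(math.atan2(fy, fx))  # fringe-normal direction
    return period, orientation

period, orientation = moire_fringes(560.0, 420.0, theta_deg=1.0)
print(f"fringe period ~ {period:.0f} nm, fringe normal at {orientation:.1f} deg")
```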

Relevance:

60.00%

Abstract:

Collaborative sharing of information is an increasingly necessary technique for achieving complex goals in today's fast-paced, technology-dominated world. Personal Health Record (PHR) systems have become a popular research area for quickly sharing patient information among health professionals. PHR systems store and process sensitive information, so they require proper security mechanisms to protect patients' private data. Thus, the access control mechanisms of a PHR should be well defined, and PHRs should be stored in encrypted form. Cryptographic schemes that can enforce access policies based on user attributes are needed for this purpose. Since attribute-based encryption can resolve these problems, we propose a patient-centric framework that protects PHRs against untrusted service providers and malicious users. In this framework, we use the Ciphertext-Policy Attribute-Based Encryption (CP-ABE) scheme as an efficient cryptographic technique, enhancing the security and privacy of the system as well as enabling access revocation. Patients can encrypt their PHRs and store them on untrusted storage servers. They also maintain full control over access to their PHR data by assigning attribute-based access control to selected data users and revoking unauthorized users instantly. To evaluate our system, we implemented a CP-ABE library and web services as part of our framework. We also developed an Android application based on the framework that allows users to register with the system, encrypt their PHR data, and upload it to the server, while authorized users can download PHR data and decrypt it. Finally, we present experimental results and a performance analysis showing that deployment of the proposed system would be practical.
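
The access-control flow the framework relies on can be outlined as below. The `MockCPABE` interface is hypothetical and performs no real cryptography; a real deployment would use an actual CP-ABE library. Only the flow follows the abstract: setup of attribute keys, policy-based encryption by the patient, and decryption only by users whose attributes satisfy the policy.

```python
# Hypothetical sketch of the patient-centric CP-ABE flow described above.
# No real cryptography: a "ciphertext" here just records its access policy
# so the control flow can be demonstrated.

class MockCPABE:
    """Stand-in for a CP-ABE scheme: a key is an attribute set, and a
    ciphertext is decryptable iff the key's attributes satisfy the
    ciphertext's policy (simplified here to an AND over attributes)."""

    def keygen(self, attributes):
        return frozenset(attributes)

    def encrypt(self, plaintext, policy):
        # policy: attributes a decryptor must possess.
        return {"policy": frozenset(policy), "payload": plaintext}

    def decrypt(self, ciphertext, key):
        if ciphertext["policy"] <= key:      # key satisfies the policy
            return ciphertext["payload"]
        raise PermissionError("attributes do not satisfy access policy")

abe = MockCPABE()

# Patient encrypts a PHR record under an attribute policy before upload.
record = abe.encrypt("blood test results",
                     policy={"role:physician", "dept:cardiology"})

doctor_key = abe.keygen({"role:physician", "dept:cardiology", "id:42"})
nurse_key = abe.keygen({"role:nurse", "dept:cardiology"})

print(abe.decrypt(record, doctor_key))       # succeeds
try:
    abe.decrypt(record, nurse_key)           # fails: policy not satisfied
except PermissionError as e:
    print("denied:", e)
```

Revocation, which the abstract highlights, requires re-keying or attribute expiry in a real CP-ABE deployment; the mock above omits it.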

Relevance:

60.00%

Abstract:

Continuous variables are one of the major data types collected by survey organizations. They can be incomplete, so that data collectors need to fill in the missing values, or they can contain sensitive information that needs protection from re-identification. One approach to protecting continuous microdata is to aggregate the values into cells defined by different features. In this thesis, I present novel methods of multiple imputation (MI) that can be applied to impute missing values and to synthesize confidential values for continuous and magnitude data.

The first method is for limiting the disclosure risk of continuous microdata whose marginal sums are fixed. The motivation for developing such a method comes from the magnitude tables of non-negative integer values in economic surveys. I present approaches based on a mixture of Poisson distributions to describe the multivariate distribution, so that the marginals of the synthetic data are guaranteed to sum to the original totals. At the same time, I present methods for assessing the disclosure risks of releasing such synthetic magnitude microdata. An illustration on a survey of manufacturing establishments shows that the disclosure risks are low while the information loss is acceptable.
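
The fixed-marginal constraint can be honored exactly because independent Poisson counts conditioned on their total are multinomially distributed. A minimal sketch of that fact, using a single Poisson component rather than the thesis's mixture, and with an assumed smoothing of the rate estimates:

```python
import numpy as np

# Minimal sketch of synthesizing non-negative integer microdata whose sum
# matches a fixed cell total. Independent Poisson counts conditioned on
# their total are multinomial, so drawing from a multinomial with
# probabilities proportional to the Poisson rates preserves the marginal
# exactly. The thesis uses a *mixture* of Poissons; this shows one component.

rng = np.random.default_rng(0)

original = np.array([12, 7, 0, 31, 5])     # confidential establishment values
total = original.sum()                      # published marginal, must be kept

rates = original + 0.5                      # smoothed rate estimates (assumed)
synthetic = rng.multinomial(total, rates / rates.sum())

print(synthetic, synthetic.sum() == total)  # synthetic sums to the original total
```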

The second method is for releasing synthetic continuous microdata by a nonstandard MI method. Traditionally, MI fits a model on the confidential values and then generates multiple synthetic datasets from this model. Its disclosure risk tends to be high, especially when the original data contain extreme values. I present a nonstandard MI approach conditioned on protective intervals. Its basic idea is to estimate the model parameters from these intervals rather than from the confidential values. The encouraging results of simple simulation studies suggest the potential of this new approach for limiting the posterior disclosure risk.
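
The idea of estimating parameters from intervals rather than values can be illustrated with an interval-censored likelihood, where each observation contributes the probability mass its interval covers. The normal model and the coarse binning below are assumptions for exposition, not the thesis's actual protective intervals:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Illustrative interval-censored estimation: fit a normal model using only
# protective intervals around each confidential value, never the values
# themselves. Each observation contributes log P(lo < X <= hi).

rng = np.random.default_rng(1)
confidential = rng.normal(50.0, 10.0, size=500)
lo = np.floor(confidential / 10.0) * 10.0     # coarse protective bins (assumed)
hi = lo + 10.0

def neg_log_lik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)                 # keep sigma positive
    p = norm.cdf(hi, mu, sigma) - norm.cdf(lo, mu, sigma)
    return -np.sum(np.log(np.clip(p, 1e-300, None)))

mid = (lo + hi) / 2.0                         # initialize from the released bins
fit = minimize(neg_log_lik, x0=[mid.mean(), np.log(mid.std())],
               method="Nelder-Mead")
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
print(f"mu ~ {mu_hat:.1f}, sigma ~ {sigma_hat:.1f}")  # near the true 50 and 10
```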

The third method is for imputing missing values in continuous and categorical variables. It extends a hierarchically coupled mixture model with local dependence, but separates the variables into non-focused (e.g., almost fully observed) and focused (e.g., with many missing values) ones. The sub-model structure for the focused variables is more complex than that for the non-focused ones; their cluster indicators are linked together by tensor factorization, and the focused continuous variables depend locally on non-focused values. The model's properties suggest that moving strongly associated non-focused variables to the focused side can help improve estimation accuracy, which is examined in several simulation studies. The method is applied to data from the American Community Survey.

Relevance:

60.00%

Abstract:

Internal control has become increasingly important for enterprises to keep in mind. Society is increasingly affected by IT development, which demands a greater degree of security. Enterprises need to make sure that their systems are up to date and secure enough to prevent unauthorized parties from accessing sensitive information. Internal control can be present in a major part of an enterprise's work: if an enterprise has a goal of no harm or serious injury at work, internal control is necessary to reach that goal. The purpose of this essay is to examine how five different departments of Trafikverket practice internal control: how internal control is described, how guidance from management is described, and how it reaches the rest of the enterprise. This leads to a proposal for improving internal control at Trafikverket. We base our frame of reference on the COSO model and its five components: control environment, risk assessment, control activities, information and communication, and monitoring. The essay is a case study of Trafikverket. We chose a qualitative method and interviewed five respondents from the different departments of Trafikverket. The respondents work with internal control in their everyday work or have good insight into the subject. We used a semi-structured interview guide with questions based on the COSO framework. The results of our study show that there are large variations in how the departments work with internal control. It emerged that there are new guidelines for how the work should be done, which makes training necessary to implement the new ways of working. How the departments use the COSO model varies: some have incorporated the model into their new ways of working, while others have never heard of it. The conclusion of our study is that the COSO model and its components contribute to functioning internal control. Implementing the components is important, and the most important factor in good internal control is corporate management. Training within the enterprise is the most effective way to inform the staff about the model and to implement it.

Relevance:

60.00%

Abstract:

The use of secondary data in health care research has become a very important issue over the past few years. Data from the treatment context are being used for external quality assurance, as well as to answer medical questions through registers and research databases. Additionally, the establishment of electronic clinical systems such as data warehouses provides new opportunities for the secondary use of clinical data. Because health data are among the most sensitive information about an individual, the data must be safeguarded against disclosure.

Relevance:

60.00%

Abstract:

Reverse engineering is usually the stepping stone of a variety of attacks aiming at identifying sensitive information (keys, credentials, data, algorithms) or vulnerabilities and flaws for broader exploitation. Software applications are usually deployed as identical binary code installed on millions of computers, enabling an adversary to develop a generic reverse-engineering strategy that, if working on one code instance, could be applied to crack all the other instances. A solution to mitigate this problem is represented by software diversity, which aims at creating several structurally different (but functionally equivalent) binary code versions out of the same source code, so that even if a successful attack can be elaborated for one version, it should not work on a diversified version. In this paper, we address the problem of maximizing software diversity from a search-based optimization point of view. The program to protect is subjected to a catalogue of transformations to generate many candidate versions. The problem of selecting the subset of most diversified versions to be deployed is formulated as an optimization problem, which we tackle with different search heuristics. We show the applicability of this approach on some popular Android apps.
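
One simple heuristic for the subset-selection problem is greedy max-min dispersion: start from an arbitrary variant and repeatedly add the candidate farthest from the already-selected set. The distance matrix below is a toy stand-in; the paper's actual diversity metric and search heuristics are not specified in the abstract.

```python
# Greedy max-min sketch of selecting k maximally diverse program variants
# from a candidate pool, given a pairwise structural-distance matrix.
# The distance values are toy numbers for illustration.

def greedy_diverse_subset(dist, k):
    n = len(dist)
    chosen = [0]                              # seed with an arbitrary variant
    while len(chosen) < k:
        # Add the candidate whose distance to its nearest already-chosen
        # variant is largest (max-min dispersion criterion).
        best = max((i for i in range(n) if i not in chosen),
                   key=lambda i: min(dist[i][j] for j in chosen))
        chosen.append(best)
    return chosen

# Toy symmetric distance matrix among 5 diversified builds.
dist = [
    [0, 4, 1, 6, 3],
    [4, 0, 5, 2, 6],
    [1, 5, 0, 7, 4],
    [6, 2, 7, 0, 5],
    [3, 6, 4, 5, 0],
]
print(greedy_diverse_subset(dist, k=3))       # [0, 3, 4]
```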

Relevance:

60.00%

Abstract:

Individuals living in highly networked societies publish a large amount of personal, and potentially sensitive, information online. Web investigators can exploit such information for a variety of purposes, such as background vetting and fraud detection. However, such investigations require a large number of expensive man-hours and much human effort. This paper describes InfoScout, a search tool intended to reduce the time it takes to identify and gather subject-centric information on the Web. InfoScout collects relevance feedback from the investigator in order to re-rank search results, allowing the intended information to be discovered more quickly. Users may still direct their search as they see fit, issuing ad-hoc queries and filtering existing results by keywords. Design choices are informed by prior work and industry collaboration.
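
The core feedback loop can be sketched with a Rocchio-style update: results the investigator marks as relevant pull the query vector toward them, and the remaining results are re-ranked. The TF-IDF representation and the alpha/beta weights are standard textbook choices, not necessarily InfoScout's.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Rocchio-style relevance feedback sketch: marked-relevant results pull the
# query vector toward them, re-ranking the rest. Weights are conventional
# textbook values (alpha=1, beta=0.75), not InfoScout's actual parameters.

results = [
    "John Smith, software engineer at Acme Corp, profile page",
    "Smith's Plumbing, emergency services and repairs",
    "John Smith presents at security conference on fraud detection",
    "Recipe blog by J. Smith: weeknight dinners",
]

vec = TfidfVectorizer()
doc_vectors = vec.fit_transform(results)
query = vec.transform(["john smith fraud"])

relevant = [2]                                # investigator marks result 2 relevant
alpha, beta = 1.0, 0.75
centroid = doc_vectors[relevant].mean(axis=0)
new_query = alpha * query.toarray() + beta * np.asarray(centroid)

scores = cosine_similarity(new_query, doc_vectors).ravel()
for i in np.argsort(-scores):                 # re-ranked result list
    print(f"{scores[i]:.3f}  {results[i]}")
```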

Relevance:

60.00%

Abstract:

This paper reports on the use of non-symbolic fragmentation of data for securing communications. Non-symbolic fragmentation, or NSF, relies on breaking up data into non-symbolic fragments: (usually irregularly sized) chunks whose boundaries do not necessarily coincide with the boundaries of the symbols making up the data. For example, ASCII data is broken up into fragments which may include 8-bit fragments but also include fragments of many other sizes. Fragments are then separated with a form of path diversity. The secrecy of the transmission relies on the secrecy of one or more of the following: the ordering of the fragments, the sizes of the fragments, and the use of path diversity. Once NSF is in place, it can help secure many forms of communication, and it is useful for exchanging sensitive information and for commercial transactions. A sample implementation is described along with an evaluation of the technology.
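
A toy version of the fragmentation step: treat the message as a bit string and cut it at irregular positions that need not align with 8-bit symbol boundaries. The fragment-size schedule plays the role of a shared secret here; the sizes and the reassembly scheme are illustrative assumptions, and the paper's path-diversity transport is not modeled.

```python
# Toy non-symbolic fragmentation: the message is treated as a bit string
# and cut at irregular bit positions that need not align with 8-bit symbol
# boundaries. Real NSF additionally spreads fragments across diverse paths.

def to_bits(data: bytes) -> str:
    return "".join(f"{byte:08b}" for byte in data)

def from_bits(bits: str) -> bytes:
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

def fragment(data: bytes, sizes):
    bits, out, pos = to_bits(data), [], 0
    for size in sizes:
        out.append(bits[pos:pos + size])
        pos += size
    out.append(bits[pos:])                 # remainder as a final fragment
    return out

secret_sizes = [3, 11, 5, 13, 7]           # irregular, non-8-bit-aligned cuts
frags = fragment(b"transfer $500", secret_sizes)
print(frags[:3])                           # fragments straddle symbol boundaries

# A receiver knowing the ordering concatenates and re-packs the bits.
assert from_bits("".join(frags)) == b"transfer $500"
```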

Relevance:

60.00%

Abstract:

The big data era has dramatically transformed our lives; however, security incidents such as data breaches can put sensitive data (e.g., photos, identities, genomes) at risk. To protect users' data privacy, there is growing interest in building secure cloud computing systems, which keep sensitive data inputs hidden even from the computation providers. Conceptually, secure cloud computing systems leverage cryptographic techniques (e.g., secure multiparty computation) and trusted hardware (e.g., secure processors) to instantiate a "secure" abstract machine consisting of a CPU and encrypted memory, so that an adversary cannot learn information through either the computation within the CPU or the data in the memory. Unfortunately, evidence has shown that side channels (e.g., memory accesses, timing, and termination) in such a "secure" abstract machine may still leak highly sensitive information, including the cryptographic keys that form the root of trust for these secure systems. This thesis broadly expands the investigation of a research direction called trace-oblivious computation, in which programming language techniques are employed to prevent side-channel information leakage. We demonstrate the feasibility of trace-oblivious computation by formalizing and building several systems: GhostRider, a hardware-software co-design that provides a hardware-based trace-oblivious computing solution; SCVM, an automatic RAM-model secure computation system; and ObliVM, a programming framework that helps programmers develop applications. All of these systems enjoy formal security guarantees while performing better than prior systems by one to several orders of magnitude.
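
The kind of leak these techniques close can be seen in a toy example: a naive lookup touches only the secret index, leaking it through the memory trace, while an oblivious lookup touches every element so the trace is independent of the secret. GhostRider, SCVM, and ObliVM achieve this far more efficiently (e.g., with Oblivious RAM) than the linear scan sketched here.

```python
# Toy illustration of trace-oblivious memory access. A naive lookup touches
# only arr[secret_index], leaking the index through the access pattern; the
# oblivious version touches every slot, so the trace is independent of the
# secret. Real systems use ORAM rather than a linear scan.

def naive_read(arr, secret_index):
    return arr[secret_index]           # access pattern reveals secret_index

def oblivious_read(arr, secret_index):
    result = 0
    for i, value in enumerate(arr):    # every slot is read on every lookup
        is_target = int(i == secret_index)
        # Branch-free select: keep the accumulator unless i matches.
        result = result * (1 - is_target) + value * is_target
    return result

data = [10, 20, 30, 40]
assert oblivious_read(data, 2) == naive_read(data, 2) == 30
```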

Relevance:

40.00%

Abstract:

This thesis investigates whether the composition of banks' debt affects their dividend policy. It identifies information-sensitive investors (institutional investors) as the targets of dividend signaling by banks. Using a unique database of Brazilian banks, it was possible to identify several types of creditors, specifically institutional investors, non-financial firms, and individuals, who are potential targets of dividend signaling. Additionally, the existence of several privately held banks, controlled and managed by a small group of shareholders, in which signaling directed at shareholders is implausible, allows us to infer that banks that rely more on funds from information-sensitive (institutional) investors pay more dividends, controlling for several characteristics. During the financial crisis, this behavior was even more pronounced. This relationship reinforces the role of dividends as a costly and credible way to communicate the quality of banks' assets. The hypothesis that dividends could be used by shareholders as a form of expropriation of depositors is refuted, since, if that were the case, higher dividends would be observed in banks with less information-sensitive depositors. Furthermore, a negative relationship was found between dividend payments and funding costs (interest paid on bank certificates of deposit), and a positive relationship between dividends and both size and past earnings; privately held banks pay more dividends than publicly traded ones, a finding that also aligns with the idea that depositors are the targets of dividend signaling. Finally, a negative relationship was also found between dividends and banks' capital adequacy, indicating that regulatory pressure may induce banks to pay fewer dividends, and dividend payments were found to be negatively related to loan portfolio growth, which is consistent with the idea that banks with greater investment opportunities retain their earnings to increase their equity and their lending capacity.