930 results for "data anonymization mechanism"
Abstract:
Calreticulin is a lectin-like molecular chaperone of the endoplasmic reticulum in eukaryotes. Its interaction with N-glycosylated polypeptides is mediated by the glycan, Glc(1)Man(9)GlcNAc(2), present on the target glycoproteins. In this work, binding of the monoglucosyl IgG (chicken) substrate to calreticulin has been studied using real-time association kinetics of the interaction with a biosensor based on surface plasmon resonance (SPR). By SPR, accurate association and dissociation rate constants were determined, and these yielded a micromolar association constant. The nature of the reaction was unaffected by immobilization of either of the reactants. The Scatchard analysis values for K-a agreed well with the one obtained from the ratio k(1)/k(-1). The interaction was completely inhibited by the free oligosaccharide, Glc(1)Man(9)GlcNAc(2), whereas Man(9)GlcNAc(2) did not bind to the calreticulin-substrate complex, attesting to the exquisite specificity of this interaction. The binding of calreticulin to IgG was used for the development of an immunoassay, and the relative affinity of the lectin-substrate association was indirectly measured. The values are in agreement with those obtained with SPR. Although the reactions are several orders of magnitude slower than diffusion-controlled processes, the data are qualitatively and quantitatively consistent with single-step bimolecular association and dissociation reactions. Analyses of the activation parameters indicate that the reaction is enthalpically driven and does not involve a highly ordered transition state. Based on these data, the mechanism of calreticulin's chaperone activity is briefly discussed.
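The kinetic relationship the abstract relies on, namely the equilibrium association constant obtained as the ratio k(1)/k(-1), can be sketched numerically. The rate constants below are illustrative placeholders, not the paper's measured values:

```python
# Illustrative rate constants for a lectin-glycoprotein interaction
# (placeholder values, not the measured data from the paper).
k_on = 1.2e3    # association rate constant k(1), M^-1 s^-1
k_off = 1.5e-3  # dissociation rate constant k(-1), s^-1

# Equilibrium association constant from the ratio of rate constants.
K_a = k_on / k_off  # M^-1
K_d = 1.0 / K_a     # dissociation constant, M (micromolar here)

print(f"K_a = {K_a:.2e} M^-1, K_d = {K_d:.2e} M")
```

A Scatchard estimate of K-a from equilibrium binding data should agree with this kinetic ratio, which is the consistency check the abstract describes.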
Abstract:
In this paper, we propose a Loss Tolerant Reliable (LTR) data transport mechanism for dynamic Event Sensing (LTRES) in WSNs. In LTRES, a reliable event-sensing requirement at the transport layer is dynamically determined by the sink. A distributed source rate adaptation mechanism is designed, incorporating a loss-rate-based lightweight congestion control mechanism, to regulate the data traffic injected into the network so that the reliability requirement can be satisfied. An equation-based fair rate control algorithm is used to improve the fairness among LTRES flows sharing the congested path. The performance evaluations show that LTRES can provide LTR data transport service for multiple events with short convergence time, low loss rate and high overall bandwidth utilization.
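A minimal sketch of the kind of source-rate adaptation described above: multiplicative backoff on a loss-rate congestion signal, otherwise adjusting toward the sink's reliability requirement. The update rule and all constants are assumptions for illustration, not the LTRES algorithm itself:

```python
def adapt_rate(rate, observed_reliability, required_reliability,
               loss_rate, loss_threshold=0.1, increase=1.1, decrease=0.5):
    """Hypothetical source-rate update in the spirit of LTRES (the rule
    and constants are assumptions, not the paper's algorithm): back off
    multiplicatively when the loss rate signals congestion, otherwise
    nudge the rate up until the sink's reliability requirement is met."""
    if loss_rate > loss_threshold:                   # congestion: back off
        return rate * decrease
    if observed_reliability < required_reliability:  # under-reliable: speed up
        return rate * increase
    return rate                                      # requirement met: hold

r = adapt_rate(100.0, observed_reliability=0.80,
               required_reliability=0.95, loss_rate=0.02)
```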
Abstract:
The deployment of wireless communications, coupled with the popularity of portable devices, has led to significant research in the area of mobile data caching. Prior research has focused on the development of solutions that allow applications to run in wireless environments using proxy-based techniques. Most of these approaches are semantic based and do not provide adequate support for representing the context of a user (i.e., the interpreted human intention). Although the context may be treated implicitly, it is still crucial to data management. To address this challenge, this dissertation focuses on two characteristics: how to predict (i) the future location of the user and (ii) the locations where the fetched data items remain valid answers to the query. Using this approach, more complete information about the dynamics of an application environment is maintained. The contribution of this dissertation is a novel data caching mechanism for pervasive computing environments that can adapt dynamically to a mobile user's context. In this dissertation, we design and develop a conceptual model and context-aware protocols for wireless data caching management. Our replacement policy uses the validity of the data fetched from the server and the neighboring locations to decide which of the cache entries is least likely to be needed in the future, and is therefore a good candidate for eviction when cache space is needed. The context-aware prefetching algorithm exploits the query context to effectively guide the prefetching process. The query context is defined using a mobile user's movement pattern and requested information context. Numerical results and simulations show that the proposed prefetching and replacement policies significantly outperform conventional ones. Anticipated applications of these solutions include biomedical engineering, tele-health, medical information systems and business.
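The replacement idea, evicting the entry least likely to answer future queries given the user's predicted movement, can be sketched as follows. The distance-based score and the data layout are hypothetical simplifications of the dissertation's policy:

```python
def eviction_candidate(cache, predicted_location):
    """Hypothetical replacement policy in the spirit of the abstract:
    evict the entry whose validity region lies farthest from the user's
    predicted next location, as it is least likely to answer future
    queries. `cache` maps item id -> (x, y) centre of the region where
    the cached answer is valid; `predicted_location` is an (x, y) pair."""
    px, py = predicted_location
    def distance(entry):
        x, y = cache[entry]
        return ((x - px) ** 2 + (y - py) ** 2) ** 0.5
    return max(cache, key=distance)

cache = {"a": (0, 0), "b": (5, 5), "c": (1, 1)}
victim = eviction_candidate(cache, (0, 0))  # "b" lies farthest, so it is evicted
```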
Abstract:
Personal information is increasingly gathered and used for providing services tailored to user preferences, but the datasets used to provide such functionality can represent serious privacy threats if not appropriately protected. Work in privacy-preserving data publishing has targeted privacy guarantees that protect against record re-identification, by making records indistinguishable, or against sensitive attribute value disclosure, by introducing diversity or noise in the sensitive values. However, most approaches fail in the high-dimensional case, and the ones that do not come at a utility cost incompatible with tailored recommendation scenarios. This paper aims at a sensible trade-off between privacy and the benefits of tailored recommendations, in the context of privacy-preserving data publishing. We empirically demonstrate that significant privacy improvements can be achieved at a utility cost compatible with tailored recommendation scenarios, using a simple partition-based sanitization method.
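A toy version of partition-based sanitization on a single quasi-identifier column, in the spirit of (but much simpler than) the paper's method: records are grouped into partitions of at least k, and each value is replaced by its partition mean so that no record is distinguishable within its partition:

```python
def partition_sanitize(values, k):
    """Toy partition-based sanitization (illustrative, not the paper's
    method): sort 1-D quasi-identifier values, split them into groups
    of at least k records, and replace each value with its group mean."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    sanitized = [0.0] * len(values)
    for start in range(0, len(order), k):
        group = order[start:start + k]
        if len(group) < k and start > 0:  # merge a short tail into the previous group
            group = order[start - k:]
        mean = sum(values[i] for i in group) / len(group)
        for i in group:
            sanitized[i] = mean
    return sanitized

# Invented ages for six records, partitions of size k=2:
out = partition_sanitize([34, 21, 58, 40, 25, 61], k=2)
```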
Abstract:
It is important to examine the nature of the relationships between roadway, environmental, and traffic factors and motor vehicle crashes, with the aim to improve the collective understanding of causal mechanisms involved in crashes and to better predict their occurrence. Statistical models of motor vehicle crashes are one path of inquiry often used to gain these initial insights. Recent efforts have focused on the estimation of negative binomial and Poisson regression models (and related variants) due to their relatively good fit to crash data. Of course, analysts constantly seek methods that offer greater consistency with the data generating mechanism (motor vehicle crashes in this case), provide better statistical fit, and provide insight into data structure that was previously unavailable. One such opportunity exists with some types of crash data, in particular crash-level data that are collected across roadway segments, intersections, etc. It is argued in this paper that some crash data possess a hierarchical structure that has not routinely been exploited. This paper describes the application of binomial multilevel models of crash types using 548 motor vehicle crashes collected from 91 two-lane rural intersections in the state of Georgia. Crash prediction models are estimated for angle, rear-end, and sideswipe (both same direction and opposite direction) crashes. The contributions of the paper are the realization of hierarchical data structure and the application of a theoretically appealing and suitable analysis approach for multilevel data, yielding insights into intersection-related crashes by crash type.
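The multilevel idea, crash-level covariates combined with an intersection-level random intercept on the log-odds scale, can be illustrated with a toy two-level binomial logit. All coefficients are invented for illustration, not the paper's estimates:

```python
import math

def crash_type_probability(x, beta, intersection_effect):
    """Toy two-level binomial logit (illustrative of the multilevel idea,
    not the paper's fitted model): crash-level covariates plus a random
    intercept for the intersection enter the log-odds that a crash is of
    a given type (e.g. an angle crash)."""
    log_odds = beta[0] + sum(b * v for b, v in zip(beta[1:], x))
    log_odds += intersection_effect  # intersection-level random intercept
    return 1.0 / (1.0 + math.exp(-log_odds))

# Invented coefficients: intercept -0.5, one covariate effect 0.8,
# and an intersection effect of 0.2.
p = crash_type_probability([1.0], beta=[-0.5, 0.8], intersection_effect=0.2)
```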
Abstract:
Availability has become a primary goal of information security and is as significant as other goals, in particular confidentiality and integrity. Maintaining availability of essential services on the public Internet is an increasingly difficult task in the presence of sophisticated attackers. Attackers may abuse the limited computational resources of a service provider, and thus managing computational costs is a key strategy for achieving the goal of availability. In this thesis we focus on cryptographic approaches for managing computational costs, in particular computational effort. We focus on two cryptographic techniques: computational puzzles in cryptographic protocols and secure outsourcing of cryptographic computations. This thesis contributes to the area of cryptographic protocols in the following ways. First, we propose the most efficient puzzle scheme based on modular exponentiations which, unlike previous schemes of the same type, involves only a few modular multiplications for solution verification; our scheme is provably secure. We then introduce a new efficient gradual authentication protocol by integrating a puzzle into a specific signature scheme. Our software implementation results for the new authentication protocol show that our approach is more efficient and effective than the traditional RSA signature-based one and improves the DoS resilience of the Secure Socket Layer (SSL) protocol, the most widely used security protocol on the Internet. Our next contributions are related to capturing a specific property that enables secure outsourcing of cryptographic tasks, in particular partial decryption. We formally define the property of (non-trivial) public verifiability for general encryption schemes, key encapsulation mechanisms (KEMs), and hybrid encryption schemes, encompassing public-key, identity-based, and tag-based encryption flavors.
We show that some generic transformations and concrete constructions enjoy this property, and then present a new public-key encryption (PKE) scheme having this property with a proof of security under standard assumptions. Finally, we combine puzzles with PKE schemes to enable delayed decryption in applications such as e-auctions and e-voting. For this we first introduce the notion of effort-release PKE (ER-PKE), encompassing the well-known timed-release encryption and encapsulated key escrow techniques. We then present a security model for ER-PKE and a generic construction of ER-PKE complying with our security notion.
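The verification shortcut claimed for the puzzle scheme (cheap checking of an expensive modular-exponentiation puzzle) can be illustrated with a Rivest-style time-lock puzzle, which is a related construction rather than the thesis's exact scheme: the solver performs t sequential squarings, while the issuer, knowing the factorization of n, verifies with a single exponentiation:

```python
# Sketch of a modular-exponentiation puzzle in the spirit of Rivest-style
# time-lock puzzles (a related construction, not the thesis's scheme).
# Solving takes t sequential squarings; the issuer, who knows the
# factorisation of n, verifies cheaply because the exponent 2^t can be
# reduced modulo phi(n).

p, q = 10007, 10009        # toy primes, far too small for real use
n = p * q
phi = (p - 1) * (q - 1)
t = 1000                   # puzzle difficulty (number of squarings)
x = 5                      # puzzle instance, gcd(x, n) == 1

# Solver: t sequential modular squarings (inherently sequential work).
y = x
for _ in range(t):
    y = (y * y) % n

# Verifier: exponent reduced mod phi(n), then a single pow() call.
e = pow(2, t, phi)
assert pow(x, e, n) == y   # cheap check accepts the solver's answer
```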
Abstract:
Genetic Algorithms (GAs) were used to design triangular-lattice photonic crystals with a large absolute band gap. Considering fabrication issues, the algorithm represented the unit cell with large pixels and took the largest absolute band gap below the fifth band as the objective function. By integrating a Fourier-transform data storage mechanism, the algorithm ran efficiently and effectively and optimized a triangular-lattice photonic crystal with scatterers in the shape of a 'dielectric-air rod'. It had a large absolute band gap with a relative width (ratio of gap width to midgap frequency) of 23.8%.
Abstract:
We used the Plane Wave Expansion Method and a Rapid Genetic Algorithm (RGA) to design two-dimensional photonic crystals with a large absolute band gap. A filling-fraction controlling operator and a Fourier-transform data storage mechanism were integrated into the genetic operators to obtain the desired photonic crystals effectively and efficiently. Starting from randomly generated photonic crystals, the proposed RGA evolved toward the best objectives and yielded a square-lattice photonic crystal with a relative band gap (defined as the gap-to-midgap ratio) as large as 13.25%. Furthermore, the evolutionary objective was modified and yielded a satisfactory photonic crystal better suited to slab systems.
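The objective function both photonic-crystal abstracts optimize, the gap-to-midgap ratio, is straightforward to compute from the band-edge frequencies; the edge values below are illustrative, not taken from either paper:

```python
def gap_to_midgap(lower_edge, upper_edge):
    """Relative band-gap width used as the GA objective: the gap width
    divided by the midgap frequency."""
    gap = upper_edge - lower_edge
    midgap = 0.5 * (upper_edge + lower_edge)
    return gap / midgap

# Illustrative band edges in normalized frequency units (a/lambda):
ratio = gap_to_midgap(0.38, 0.44)
print(f"{100 * ratio:.2f}%")  # roughly 14.6%
```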
Abstract:
In the present paper, the electrochemical behavior of ergosterol has been investigated by in situ circular dichroism (CD) spectroelectrochemistry with a long-path-length thin-layer cell. The formal potential E0 (1.02 V) and the value of alpha*n(alpha) (0.302) for the electrooxidation of ergosterol were obtained from the CD spectroelectrochemical data. A mechanism for the electrooxidation of ergosterol is suggested.
Abstract:
A network-state information collection protocol must guarantee both the accuracy and timeliness of the collected information and the lightweight nature of the protocol algorithm. To resolve this tension, a lightweight, energy-efficient, hierarchical clustering data collection mechanism based on lossless aggregation (QTBDC) is proposed. QTBDC first assigns codes to the network nodes and builds a logical hierarchical cluster structure among them; it then exploits the similarity of the state data within each sub-cluster and the continuity of the node codes to achieve lossless in-network aggregation. This monitoring mechanism greatly reduces data traffic while losing no detail of the collected network-state information. Simulation analysis shows that, compared with existing classical data collection methods, the proposed approach saves energy and prolongs the network lifetime.
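The lossless in-network aggregation idea, merging runs of consecutively coded nodes that report the same state, can be sketched as follows (a simplification of QTBDC, with invented node codes and state values):

```python
def lossless_aggregate(readings):
    """Sketch of the lossless aggregation idea behind QTBDC
    (illustrative): nodes carry consecutive codes, so runs of nodes
    reporting the same state value can be merged into (first, last,
    value) ranges without losing any per-node detail."""
    items = sorted(readings.items())  # (node_code, value) pairs
    ranges = []
    for code, value in items:
        if ranges and ranges[-1][1] == code - 1 and ranges[-1][2] == value:
            ranges[-1] = (ranges[-1][0], code, value)  # extend current run
        else:
            ranges.append((code, code, value))         # start a new run
    return ranges

# Invented node codes and states:
r = lossless_aggregate({1: "ok", 2: "ok", 3: "ok", 4: "low", 5: "ok"})
# → [(1, 3, "ok"), (4, 4, "low"), (5, 5, "ok")]
```

Every per-node reading can be reconstructed exactly from the ranges, which is what makes the aggregation lossless.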
Abstract:
A basic principle in data modelling is to incorporate available a priori information regarding the underlying data generating mechanism into the modelling process. We adopt this principle and consider grey-box radial basis function (RBF) modelling capable of incorporating prior knowledge. Specifically, we show how to explicitly incorporate two types of prior knowledge: the underlying data generating mechanism exhibits a known symmetry property, and the underlying process obeys a set of given boundary value constraints. The class of orthogonal least squares regression algorithms can readily be applied to construct parsimonious grey-box RBF models with enhanced generalisation capability.
Abstract:
A fundamental principle in data modelling is to incorporate available a priori information regarding the underlying data generating mechanism into the modelling process. We adopt this principle and consider grey-box radial basis function (RBF) modelling capable of incorporating prior knowledge. Specifically, we show how to explicitly incorporate two types of prior knowledge: (i) the underlying data generating mechanism exhibits a known symmetry property, and (ii) the underlying process obeys a set of given boundary value constraints. The class of efficient orthogonal least squares regression algorithms can readily be applied without any modification to construct parsimonious grey-box RBF models with enhanced generalisation capability.
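The symmetry prior described in both abstracts can be built into the model structurally: pairing each Gaussian basis function with its mirror image makes the model even by construction. This is a hand-rolled sketch of that idea with invented parameters, not the papers' orthogonal least squares procedure:

```python
import math

def gauss(r, width=1.0):
    """Gaussian radial basis function."""
    return math.exp(-(r * r) / (width * width))

def symmetric_rbf(x, centres, weights):
    """Grey-box RBF respecting a known even symmetry f(x) = f(-x):
    each basis function pairs a Gaussian at +c with one at -c, so the
    model is symmetric by construction (an illustrative sketch, not the
    papers' orthogonal least squares procedure)."""
    return sum(w * (gauss(abs(x - c)) + gauss(abs(x + c)))
               for c, w in zip(centres, weights))

centres, weights = [0.5, 1.5], [1.0, -0.3]  # invented model parameters
y_pos = symmetric_rbf(0.7, centres, weights)
y_neg = symmetric_rbf(-0.7, centres, weights)
assert y_pos == y_neg  # even symmetry holds exactly
```

Because the symmetry is enforced by the basis rather than learned from data, any weight vector found by least squares yields a symmetric model.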
Abstract:
Malicious programs (malware) can cause severe damage to computer systems and data. The mechanism that the human immune system uses to detect and protect against organisms that threaten the human body is efficient and can be adapted to detect malware attacks. In this paper we propose a system to perform distributed malware collection, analysis and detection, the last of which is inspired by the human immune system. After collecting malware samples from the Internet, they are dynamically analyzed so as to provide execution traces at the operating-system level and network flows that are used to create a behavioral model and to generate a detection signature. Those signatures serve as input to a malware detector, acting as the antibodies in the antigen detection process. This allows us to understand the malware attack and aids in the infection removal procedures. © 2012 Springer-Verlag.
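The antibody analogy, matching a sample's execution trace against a behavioral signature, can be sketched as an in-order subsequence test. The trace events and the signature below are invented for illustration:

```python
def matches_signature(trace, signature):
    """Toy behavioural detector (illustrating the antibody analogy,
    not the paper's system): a signature matches when its operations
    appear in order in the execution trace, possibly interleaved with
    other events."""
    it = iter(trace)
    return all(op in it for op in signature)  # in-order subsequence test

# Invented execution trace and behavioural signature:
trace = ["open_file", "read_registry", "connect", "send", "self_delete"]
signature = ["connect", "send", "self_delete"]
detected = matches_signature(trace, signature)  # True: subsequence present
```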