991 results for certificate-based signatures


Relevance: 80.00%

Publisher:

Abstract:

The major purpose of Vehicular Ad Hoc Networks (VANETs) is to give motorists access to safety-related messages so that they can react to emergencies and make life-critical decisions, thereby enhancing road safety. Access to safety-related information over VANET communications must therefore be protected, as motorists may make critical decisions in response to emergency situations. If introducing security services into VANETs causes considerable transmission latency or processing delays, this would defeat the purpose of using VANETs to improve road safety. Current research in secure messaging for VANETs focuses on employing a certificate-based Public Key Cryptosystem (PKC) to support security. The security overhead of such a scheme, however, creates a transmission delay and introduces a time-consuming verification process to VANET communications. This paper proposes an efficient public key management system for VANETs: the Public Key Registry (PKR) system. Not only does this paper demonstrate that the proposed PKR system can maintain security, but it also argues that it can improve overall performance and scalability at a lower cost compared to the certificate-based PKC scheme. It is believed that the proposed PKR system will create a new dimension to the key management and verification services for VANETs.
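The registry idea can be illustrated with a minimal sketch (all names and structures below are hypothetical, not the paper's actual design): instead of verifying a certificate chain per message, a verifier performs a single lookup against a trusted registry that maps sender identities to their current public keys.

```python
# Minimal sketch of a Public Key Registry (PKR) lookup, as opposed to
# certificate-chain validation. Names and structures are illustrative only.

class PublicKeyRegistry:
    """Trusted registry mapping a vehicle/entity ID to its current public key."""

    def __init__(self):
        self._keys = {}

    def register(self, entity_id, public_key):
        # In a real system this write would itself be authenticated.
        self._keys[entity_id] = public_key

    def lookup(self, entity_id):
        # A single lookup replaces recursive certificate-path verification.
        return self._keys.get(entity_id)


registry = PublicKeyRegistry()
registry.register("vehicle-42", "pk-abc123")

assert registry.lookup("vehicle-42") == "pk-abc123"
assert registry.lookup("vehicle-99") is None  # unknown sender: reject message
```

The latency argument in the abstract rests on exactly this difference: a constant-time lookup against a trusted store versus recursive verification of a certificate path.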

Relevance: 80.00%

Publisher:

Abstract:

The primary goal of the Vehicular Ad Hoc Network (VANET) is to provide real-time safety-related messages to motorists to enhance road safety. Accessing and disseminating safety-related information through the use of wireless communications technology in VANETs should be secured, as motorists may make critical decisions in dealing with an emergency situation based on the received information. If security concerns are not addressed in developing VANET systems, an adversary can tamper with, or suppress, an unprotected message to mislead motorists and cause traffic accidents and hazards. Current research on secure messaging in VANETs focuses on employing the certificate-based Public Key Infrastructure (PKI) scheme to support message encryption and digital signing. The security overhead of such a scheme, however, creates a transmission delay and introduces a time-consuming verification process to VANET communications. This thesis proposes a novel public key verification and management approach for VANETs: the Public Key Registry (PKR) regime. Compared to the VANET PKI scheme, this new approach can satisfy the necessary security requirements with improved performance and scalability, and at a lower cost, by reducing the security overheads of message transmission and eliminating digital certificate deployment and maintenance issues. The proposed PKR regime consists of the required infrastructure components, rules for public key management and verification, and a set of interactions and associated behaviours to meet these rules. This is achieved through a system design expressed as a logic process model with functional specifications, so that the PKR regime can be used as development guidelines for conforming implementations. The analysis and evaluation of the proposed regime covers its security features, the security overhead of message transmission, transmission latency, processing latency, and scalability.
Compared to certificate-based PKI approaches, the proposed PKR regime can maintain the necessary security requirements, reduce the security overhead by approximately 70%, and improve performance by 98%. The scalability evaluation shows that the latency of the proposed PKR regime remains low, at approximately 15 milliseconds, whether operating in a large or small environment. It is therefore believed that this research will create a new dimension to the provision of secure messaging services in VANETs.

Relevance: 80.00%

Publisher:

Abstract:

To protect health information, cryptography plays an important role in establishing confidentiality, authentication, integrity and non-repudiation. Keys used for encryption/decryption and digital signing must be managed in a safe, secure, effective and efficient fashion. The certificate-based Public Key Infrastructure (PKI) scheme may seem to be a common way to support information security; so far, however, there is still a lack of successful large-scale certificate-based PKI deployments in the world. To address the limitations of the certificate-based PKI scheme, this paper proposes a non-certificate-based key management scheme for a national e-health implementation. The proposed scheme eliminates certificate management and complex certificate validation procedures while still maintaining security. It is also believed that this study will create a new dimension to the provision of security for the protection of health information in a national e-health environment.

Relevance: 80.00%

Publisher:

Abstract:

An increasing number of countries are faced with an aging population in increasing need of healthcare services. For any e-health information system, earning the trust of clients who may have little knowledge of the security scheme involved is paramount. In addition, scalability has become a critical aspect of system design, development and ongoing management. Cryptographic systems provide the security provisions needed for confidentiality, authentication, integrity and non-repudiation; cryptographic key management, however, must be secure, yet efficient and effective in developing an attitude of trust in system users. Digital certificate-based Public Key Infrastructure has long been the technology of choice, or of availability, for information security/assurance; however, there appears to be a notable lack of successful implementations and deployments globally. Moreover, recent issues with the security of Certificate Authorities have damaged trust in these schemes. This paper proposes the adoption of a centralised public key registry structure, a non-certificate-based scheme, for large-scale e-health information systems. The proposed structure removes complex certificate management, revocation and validation structures while maintaining overall system security. Moreover, the registry concept may be easier for both healthcare professionals and patients to understand and trust.

Relevance: 80.00%

Publisher:

Abstract:

This paper addresses the development of trust in the use of Open Data through the incorporation of appropriate authentication and integrity parameters, for use by end-user Open Data application developers, in an architecture for trustworthy Open Data Services. The advantage of this architecture is that it is far more scalable and is not another certificate-based hierarchy with the attendant problems of certificate revocation management. With the use of a Public File, if a key is compromised, it is a simple matter for the single responsible entity to replace the key pair with a new one and re-perform the data file signing process. Under the proposed architecture, the Open Data environment does not interfere with the internal security schemes that might be employed by the entity. However, the architecture incorporates, when needed, parameters from the entity, e.g. the person who authorized publishing as Open Data, at the time that datasets are created or added.
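The Public File workflow described above can be sketched as follows. This is a toy illustration only: the hash-based "signature" stands in for a real digital-signature algorithm, and all entity and key names are invented.

```python
import hashlib

# Toy sketch of the "Public File" idea for Open Data integrity. Illustrative
# only: a real deployment would use a proper asymmetric signature scheme,
# not this hash-based stand-in.

def toy_sign(key: str, data: bytes) -> str:
    # Stand-in for a real signature: hash of key material plus data.
    return hashlib.sha256(key.encode() + data).hexdigest()

def toy_verify(public_file: dict, entity: str, data: bytes, sig: str) -> bool:
    # The verifier fetches the entity's current key from the Public File.
    key = public_file.get(entity)
    return key is not None and toy_sign(key, data) == sig

dataset = b"city,air_quality\nA,42\n"
public_file = {"open-data-office": "key-v1"}   # published Public File entry
sig = toy_sign("key-v1", dataset)
assert toy_verify(public_file, "open-data-office", dataset, sig)

# Key compromise: the single responsible entity swaps in a new key pair and
# re-signs its data files; no certificate revocation machinery is needed.
public_file["open-data-office"] = "key-v2"
assert not toy_verify(public_file, "open-data-office", dataset, sig)
sig2 = toy_sign("key-v2", dataset)
assert toy_verify(public_file, "open-data-office", dataset, sig2)
```

The key-rollover step is the point of the design: replacing one Public File entry invalidates old signatures without any hierarchy-wide revocation process.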

Relevance: 80.00%

Publisher:

Abstract:

Since its introduction, the selective-identity (sID) model for identity-based cryptosystems and its relationship with various other notions of security has been extensively studied. As a result, it is a general consensus that the sID model is much weaker than the full-identity (ID) model. In this paper, we study the sID model for the particular case of identity-based signatures (IBS). The main focus is on the problem of constructing an ID-secure IBS from an sID-secure IBS without using random oracles (the so-called standard model) and with reasonable security degradation. We accomplish this by devising a generic construction which uses as black boxes: (i) a chameleon hash function and (ii) a weakly secure public-key signature scheme. We argue that the resulting IBS is ID-secure, but with a tightness gap of O(q_s), where q_s is the upper bound on the number of signature queries that the adversary is allowed to make. To the best of our knowledge, this is the first attempt at such a generic construction.
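The first black box, a chameleon hash, can be illustrated with a toy discrete-log instance (a Pedersen-style construction; the parameters below are deliberately tiny and insecure, chosen only so the trapdoor collision-finding step is visible):

```python
# Toy Pedersen-style chameleon hash over a small group. Illustrative only;
# real parameters would be thousands of bits.

p, q, g = 1019, 509, 4          # safe prime p = 2q + 1; g has order q
x = 123                         # trapdoor (secret key), 1 <= x < q
h = pow(g, x, p)                # public key

def chameleon_hash(m: int, r: int) -> int:
    # CH(m, r) = g^m * h^r mod p
    return (pow(g, m, p) * pow(h, r, p)) % p

def find_collision(m: int, r: int, m_new: int) -> int:
    # The trapdoor holder solves m + x*r = m_new + x*r' (mod q) for r'.
    return (r + (m - m_new) * pow(x, -1, q)) % q

m, r = 77, 31
m_new = 200
r_new = find_collision(m, r, m_new)
assert chameleon_hash(m, r) == chameleon_hash(m_new, r_new)
assert (m, r) != (m_new, r_new)
```

Without the trapdoor x, finding such collisions is as hard as the discrete logarithm problem in the group; with it, collisions are one modular inversion away. It is this asymmetry that generic constructions of the kind described exploit.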

Relevance: 80.00%

Publisher:

Abstract:

This paper presents proof-certificate based sufficient conditions for the existence of Zeno behavior in hybrid systems near non-isolated Zeno equilibria. To establish these conditions, we first prove sufficient conditions for Zeno behavior in a special class of hybrid systems termed first quadrant interval hybrid systems. The proof-certificate sufficient conditions are then obtained through a collection of functions that effectively "reduce" a general hybrid system to a first quadrant interval hybrid system. This paper concludes with an application of these ideas to Lagrangian hybrid systems, resulting in easily verifiable sufficient conditions for Zeno behavior. © 2008 IEEE.

Relevance: 80.00%

Publisher:

Abstract:

In this work we introduce a new mathematical tool for the optimization of routes, topology design, and energy efficiency in wireless sensor networks. We introduce a vector field formulation that models communication in the network: routing is performed in the direction of this vector field at every location, and the magnitude of the vector field at every location represents the density of data transiting through that location. We define the total communication cost in the network as the integral of a quadratic form of the vector field over the network area. With this formulation, we introduce mathematical machinery based on partial differential equations very similar to Maxwell's equations in electrostatic theory, and show that in order to minimize the cost, the routes should be found from the solution of these partial differential equations. In our formulation, the sensors are sources of information, similar to positive charges in electrostatics; the destinations are sinks of information, similar to negative charges; and the network is similar to a non-homogeneous dielectric medium with a variable dielectric constant (or permittivity coefficient). In one application of this model, we offer a scheme for energy-efficient routing: the permittivity coefficient is raised in places of the network where nodes have high residual energy, and lowered where nodes do not have much energy left. Our simulations show that this method gives a significant increase in network lifetime compared to the shortest-path and weighted shortest-path schemes. Our initial focus is on the case where there is only one destination in the network; we later extend the approach to the case of multiple destinations.
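The energy-aware bias can be sketched on a discrete graph (the work itself solves PDEs over a continuous field; this weighted-shortest-path analogue, with an invented four-node topology, only illustrates the idea that high residual energy corresponds to high "permittivity" and hence low routing cost):

```python
import heapq

# Discrete sketch of permittivity-biased routing: traversing a node with low
# residual energy is made expensive, so routes bend around depleted nodes.

def dijkstra(adj, src, dst):
    dist, prev, pq = {src: 0.0}, {}, [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

energy = {"A": 1.0, "B": 0.1, "C": 1.0, "D": 1.0}   # residual energy per node
def cost(u, v):
    # Low energy ~ low permittivity ~ high traversal cost.
    return 1.0 / energy[u] + 1.0 / energy[v]

adj = {n: [] for n in energy}
for u, v in [("A", "B"), ("B", "D"), ("A", "C"), ("C", "D")]:
    adj[u].append((v, cost(u, v)))
    adj[v].append((u, cost(u, v)))

# A-B-D and A-C-D are both two hops, but routing avoids the depleted node B.
assert dijkstra(adj, "A", "D") == ["A", "C", "D"]
```

In the continuous formulation the same effect emerges from the PDE solution rather than from explicit edge weights, but the trade-off being made is the same.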
In the case of multiple destinations, we need to partition the network into several areas known as the regions of attraction of the destinations. Each destination is responsible for collecting all messages generated in its region of attraction. The optimization problem in this case is how to define the regions of attraction and how much communication load to assign to each destination to optimize the performance of the network. We use our vector field model to solve this problem: we define a conservative vector field, which can therefore be written as the gradient of a scalar field (also known as a potential field), and show that in the optimal assignment of the network's communication load to the destinations, the value of that potential field should be equal at the locations of all the destinations. Another application of the vector field model is finding the optimal locations of the destinations in the network. We show that the vector field gives the gradient of the cost function with respect to the locations of the destinations, and based on this fact we suggest an algorithm, to be applied during the design phase of a network, that relocates the destinations to reduce the communication cost. The performance of the proposed schemes is confirmed by several examples and simulation experiments. In another part of this work we focus on the notions of responsiveness and conformance of TCP traffic in communication networks. We introduce the notion of responsiveness for TCP aggregates and define it as the degree to which a TCP aggregate reduces its sending rate to the network in response to packet drops. We define metrics that describe the responsiveness of TCP aggregates, and suggest two methods for determining their values.
The first method is based on a test in which we intentionally drop a few packets from the aggregate and measure the resulting rate decrease of that aggregate. This kind of test is not robust to multiple simultaneous tests performed at different routers. We make the test robust to simultaneous tests by using ideas from the CDMA approach to multiple-access channels in communication theory. Based on this approach, we introduce tests of responsiveness for aggregates, which we call the CDMA-based Aggregate Perturbation Method (CAPM). We use CAPM to perform congestion control; a distinguishing feature of our congestion control scheme is that it maintains a degree of fairness among different aggregates. We then modify CAPM to offer methods for estimating the proportion of an aggregate of TCP traffic that does not conform to protocol specifications, and hence may belong to a DDoS attack. These methods work by intentionally perturbing the aggregate, dropping a very small number of packets from it, and observing the response of the aggregate. We offer two methods for conformance testing. In the first, we apply the perturbation tests to SYN packets sent at the start of the TCP three-way handshake, using the fact that the rate of ACK packets exchanged in the handshake should follow the rate of perturbations. In the second, we apply the perturbation tests to TCP data packets, using the fact that the rate of retransmitted data packets should follow the rate of perturbations. In both methods we use signature-based perturbations, meaning that packet drops are performed at a rate given by a function of time. We use the analogy of our problem with multiple-access communication to find the signatures; specifically, we assign orthogonal CDMA-based signatures to different routers in a distributed implementation of our methods.
As a result of this orthogonality, performance does not degrade because of cross-interference between simultaneously testing routers. We have shown the efficacy of our methods through mathematical analysis and extensive simulation experiments.

Relevance: 80.00%

Publisher:

Abstract:

BACKGROUND:

We have recently identified a number of Quantitative Trait Loci (QTL) contributing to the 2-fold muscle weight difference between the LG/J and SM/J mouse strains and refined their confidence intervals. To facilitate nomination of the candidate genes responsible for these differences we examined the transcriptome of the tibialis anterior (TA) muscle of each strain by RNA-Seq.

RESULTS:

In total, 13,726 genes were expressed in mouse skeletal muscle. Intersection of a set of 1061 differentially expressed transcripts with a mouse muscle Bayesian Network identified a coherent set of differentially expressed genes that we term the LG/J and SM/J Regulatory Network (LSRN). The integration of the QTL, transcriptome and network analyses identified eight key drivers of the LSRN (Kdr, Plbd1, Mgp, Fah, Prss23, 2310014F06Rik, Grtp1, Stk10) residing within five QTL regions; these were either polymorphic or differentially expressed between the two strains and are strong candidates for the quantitative trait genes (QTGs) underlying muscle mass. The insight gained from the network analysis, including the ability to make testable predictions, is illustrated by annotating the LSRN with knowledge-based signatures and showing that the SM/J state of the network corresponds to a more oxidative state. We validated this prediction by NADH tetrazolium reductase staining in the TA muscle, which revealed a higher oxidative potential of the SM/J compared to the LG/J strain (p<0.03).

CONCLUSION:

Thus, integration of fine resolution QTL mapping, RNA-Seq transcriptome information and mouse muscle Bayesian Network analysis provides a novel and unbiased strategy for nomination of muscle QTGs.

Relevance: 80.00%

Publisher:

Abstract:

The high sensitivity of medical information mandates stronger authentication and authorization mechanisms in e-Health systems. This paper describes the design and implementation of a certificate-based e-Health authentication and authorization architecture. This architecture was developed to authenticate e-Health professionals accessing shared clinical data among a set of affiliated health institutions based on peer-to-peer networks. The architecture had to accommodate specific medical data sharing and handling requirements, namely the security of professionals' credentials.

Relevance: 80.00%

Publisher:

Abstract:

Certificate verification in PKI is a complex and time-consuming process. In the classical PKI methodology, in order to obtain a public key and accept a certificate as valid, a verifier needs to extract a certificate path from the PKI and verify the certificates on this path recursively. Levi proposed a nested certificate model with the aim of simplifying and speeding up certificate verification. Such a nested certificate-based PKI significantly improves certificate verification, but it also requires a large increase in the number of issued certificates, which makes the model impractical for real-life deployment. To remove this drawback of nested PKI while retaining its speed of certificate verification, this paper proposes the concept of a compressed nested certificate, a significantly modified version of the nested certificate model. A compressed nested certificate PKI deploys compressed nested certificates, which speed up and simplify certificate verification while keeping the certificate load to a minimum, giving implementers the option of integrating the model into the existing PKI or building it separately as an independent model.
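The classical recursive path verification that these nested models aim to speed up can be sketched as follows (a toy model: the "signature" is a hash stand-in, and the certificate tuples are invented simplifications of real X.509 structures):

```python
import hashlib

# Toy model of classical certificate-path verification: each certificate must
# be signed by the key of the previous link, starting from a trusted root.

def toy_sig(issuer_key: str, subject: str, subject_key: str) -> str:
    return hashlib.sha256(f"{issuer_key}|{subject}|{subject_key}".encode()).hexdigest()

def verify_path(chain, trusted_root_key):
    # chain: list of (issuer_key, subject, subject_key, sig), root-issued first.
    expected_issuer = trusted_root_key
    for issuer_key, subject, subject_key, sig in chain:
        if issuer_key != expected_issuer:
            return False                       # broken chain of issuers
        if toy_sig(issuer_key, subject, subject_key) != sig:
            return False                       # signature does not verify
        expected_issuer = subject_key          # next cert must be signed by this key
    return True

root_key = "root-key"
ca_cert = ("root-key", "CA-1", "ca1-key", toy_sig("root-key", "CA-1", "ca1-key"))
ee_cert = ("ca1-key", "alice", "alice-key", toy_sig("ca1-key", "alice", "alice-key"))
assert verify_path([ca_cert, ee_cert], root_key)

forged = ("ca1-key", "mallory", "evil-key", "bogus-signature")
assert not verify_path([ca_cert, forged], root_key)
```

Each link costs a signature verification, and real deployments add revocation checks per link; the cost grows with path length, which is the overhead the nested and compressed-nested models target.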

Relevance: 80.00%

Publisher:

Abstract:

Signature-based malware detection systems have been a much-used response to the pervasive problem of malware. Identification of malware variants is essential to a detection system and is made possible by identifying invariant characteristics in related samples. To classify packed and polymorphic malware, this paper proposes a novel system, named Malwise, for malware classification using a fast application-level emulator to reverse the code-packing transformation, and two flowgraph matching algorithms to perform classification. An exact flowgraph matching algorithm is employed that uses string-based signatures and is able to detect malware with near real-time performance. Additionally, a more effective approximate flowgraph matching algorithm is proposed that uses the decompilation technique of structuring to generate string-based signatures amenable to the string edit distance. We use real and synthetic malware to demonstrate the effectiveness and efficiency of Malwise. Using more than 15,000 real malware samples collected from honeypots, the effectiveness is validated by showing an 88 percent probability that new malware is detected as a variant of existing malware. The efficiency is demonstrated on a smaller sample set, of which 86 percent of samples can be classified in under 1.3 seconds.
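The approximate matching step rests on the string edit distance, which can be sketched as follows (the signature strings below are made up for illustration; they are not Malwise's actual signature format):

```python
# Variant classification via string edit distance over flowgraph signatures:
# two samples are called variants when their signatures are close enough.

def levenshtein(a: str, b: str) -> int:
    # Classic dynamic-programming edit distance, one row at a time.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def similarity(a: str, b: str) -> float:
    return 1.0 - levenshtein(a, b) / max(len(a), len(b))

known_sig = "W{IEH}W{I}R"      # hypothetical structured-flowgraph signature
sample_sig = "W{IEH}W{IE}R"    # new sample differing by one inserted branch
assert levenshtein("kitten", "sitting") == 3
assert similarity(known_sig, sample_sig) > 0.9   # classify as a variant
```

Normalising the distance by the longer signature length gives a similarity score that can be thresholded, which is the usual way edit-distance matching is turned into a variant/non-variant decision.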

Relevance: 80.00%

Publisher:

Abstract:

Nowadays, owing to the security vulnerabilities of distributed systems, mechanisms are needed to guarantee the security requirements of distributed object communications. Middleware platforms (component integration platforms) provide security functions that typically offer services for auditing, message protection, authentication, and access control. To support these functions, middleware platforms use digital certificates that are provided and managed by external entities. However, most middleware platforms do not define requirements for obtaining, maintaining, validating and delegating digital certificates. In addition, most digital certification systems use X.509 certificates, which are complex and have many attributes. To address these problems, this work proposes a generic digital certification service for middleware platforms. The service provides flexibility through the joint use of public-key certificates, to implement the authentication function, and attribute certificates, to implement the authorization function; it also supports delegation. Certificate-based access control is transparent to objects. The proposed service defines the digital certificate format, the storage and retrieval system, certificate validation, and support for delegation. To validate the proposed architecture, this work presents an implementation of the digital certification service for the CORBA middleware platform and a case study that illustrates the service's functionality.
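The split between the two certificate kinds can be sketched as data structures (a hypothetical simplification: real public-key and attribute certificates carry signatures, validity periods, and issuer chains omitted here):

```python
from dataclasses import dataclass

# Sketch of the authentication/authorization split: an identity (public-key)
# certificate says who the subject is; attribute certificates say what the
# holder may do. Field names are illustrative, not the service's real format.

@dataclass
class IdentityCertificate:
    subject: str
    public_key: str          # used to authenticate the subject

@dataclass
class AttributeCertificate:
    holder: str              # links back to an identity certificate's subject
    attribute: str           # e.g. a role such as "operator"

def authorize(identity, attribute_certs, required):
    # The middleware checks the holder binding and the required attribute;
    # the distributed objects themselves never see this check.
    return any(a.holder == identity.subject and a.attribute == required
               for a in attribute_certs)

alice_id = IdentityCertificate("alice", "pk-alice")
alice_attrs = [AttributeCertificate("alice", "operator")]
assert authorize(alice_id, alice_attrs, "operator")
assert not authorize(alice_id, alice_attrs, "admin")
```

Keeping authorization in short-lived attribute certificates, separate from the long-lived identity certificate, is also what makes delegation natural: a holder can be issued a further attribute certificate without touching the identity chain.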

Relevance: 40.00%

Publisher:

Abstract:

Any automatically measurable, robust and distinctive physical characteristic or personal trait that can be used to identify an individual, or to verify an individual's claimed identity, is referred to as a biometric. Biometrics have gained significant interest in the wake of heightened concerns about security and rapid advances in networking, communication and mobility. Multimodal biometrics is expected to be ultra-secure and reliable, due to the presence of multiple, independent verification clues. In this study, a multimodal biometric system utilising audio and facial signatures has been implemented and an error analysis has been carried out. A total of one thousand face images and 250 sound tracks of 50 users were used to train the proposed system. To account for attempts by unregistered users, the signature data of 25 new users were also tested. Short-term spectral features were extracted from the sound data and Vector Quantization was performed using the K-means algorithm. Face images are identified with the eigenface approach, using Principal Component Analysis. The success rate of the multimodal system using speech and face is higher than that of the individual unimodal recognition systems.
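The vector-quantization step can be sketched with a minimal K-means on scalar "features" (the values below are made up; a real speaker-recognition system clusters multi-dimensional short-term spectral vectors, and uses a better initialisation than the one shown):

```python
# Minimal K-means sketch of the vector-quantisation step: the cluster
# centroids form the codebook against which a user's features are matched.

def kmeans(points, k, iters=20):
    centroids = points[:k]           # simple deterministic init for the sketch
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: abs(p - centroids[c]))
            clusters[nearest].append(p)
        # Recompute each centroid as its cluster mean (keep old if empty).
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

features = [0.9, 1.0, 1.1, 4.9, 5.0, 5.1]   # two well-separated groups
codebook = kmeans(features, k=2)
assert abs(codebook[0] - 1.0) < 0.1
assert abs(codebook[1] - 5.0) < 0.1
```

At recognition time, a speaker's features are quantized against each enrolled user's codebook and the user with the lowest total quantization error is selected; the eigenface branch plays the analogous role for the face modality.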

Relevance: 40.00%

Publisher:

Abstract:

We study a series of transient entries into the low-latitude boundary layer (LLBL) by all four Cluster spacecraft during an outbound pass through the mid-afternoon magnetopause ([X_GSM, Y_GSM, Z_GSM] ≈ [2, 7, 9] R_E). The events take place during an interval of northward IMF, as seen in the data from the ACE satellite and lagged by a propagation delay of 75 min that is well defined by two separate studies: (1) the magnetospheric variations prior to the northward turning (Lockwood et al., 2001, this issue) and (2) the field clock angle seen by Cluster after it had emerged into the magnetosheath (Opgenoorth et al., 2001, this issue). With an additional lag of 16.5 min, the transient LLBL events correlate well with swings of the IMF clock angle (in GSM) to near 90°. Most of this additional lag is explained by ground-based observations, which reveal signatures of transient reconnection in the pre-noon sector that then take 10-15 min to propagate eastward to 15 MLT, where they are observed by Cluster. The eastward phase speed of these signatures agrees very well with the motion deduced from the cross-correlation of the signatures seen on the four Cluster spacecraft. The evidence that these events are reconnection pulses includes: transient erosion of the noon 630 nm (cusp/cleft) aurora to lower latitudes; transient and travelling enhancements of the flow into the polar cap, imaged by the AMIE technique; and poleward-moving events moving into the polar cap, seen by the EISCAT Svalbard Radar (ESR). A pass of the DMSP-F15 satellite reveals that the open field lines near noon have been open for some time: the more recently opened field lines were found closer to dusk, where the flow transient and the poleward-moving event intersected the satellite pass.
The events at Cluster have the ion and electron characteristics predicted and observed by Lockwood and Hapgood (1998) for a Flux Transfer Event (FTE), with allowance for magnetospheric ion reflection at Alfvénic disturbances in the magnetopause reconnection layer. Like FTEs, the events are about 1 R_E in extent along their direction of motion and show a rise in the magnetic field strength; but unlike FTEs in general, they show no pressure excess in their core and hence no characteristic bipolar signature in the boundary-normal component. However, most of the events were observed when the magnetic field was southward, i.e. on the edge of the interior magnetic cusp, or when the field was parallel to the magnetic equatorial plane. Only when the satellite begins to emerge from the exterior boundary (when the field was northward) do the events start to show a pressure excess in their core and the consequent bipolar signature. We identify these events as the first observations of FTEs at middle altitudes.