931 results for Internet Security


Relevance: 20.00%

Abstract:

Iris-based identity verification is highly reliable, but it can also be subject to attacks. Pupil dilation or constriction stimulated by the application of drugs is an example of a sample presentation security attack which can lead to higher false rejection rates. Suspects on a watch list can potentially circumvent an iris-based system using such methods. This paper investigates a new approach using multiple parts of the iris (instances) and multiple iris samples in a sequential decision fusion framework that can yield robust performance. Results are presented and compared with the standard full-iris approach for a number of iris degradations. An advantage of the proposed fusion scheme is that the trade-off between detection errors can be controlled by setting parameters such as the number of instances and the number of samples used in the system. The system can then be operated to match security threat levels. It is shown that, for optimal values of these parameters, the fused system also has a lower total error rate.
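The instance/sample trade-off described above can be sketched analytically. This is a minimal illustration under assumptions not stated in the abstract: statistically independent decisions, an instance is passed if any of its samples is accepted, and overall acceptance requires passing every instance; the base error rates are hypothetical.

```python
def fused_error_rates(far, frr, n_instances, n_samples):
    """Fused FAR/FRR for an AND-of-instances, OR-of-samples decision rule,
    assuming statistically independent base-classifier decisions."""
    # An impostor passes one instance if any of the n_samples attempts is accepted.
    far_instance = 1 - (1 - far) ** n_samples
    # A genuine user fails one instance only if all attempts are rejected.
    frr_instance = frr ** n_samples
    fused_far = far_instance ** n_instances           # must pass every instance
    fused_frr = 1 - (1 - frr_instance) ** n_instances
    return fused_far, fused_frr

# Hypothetical base rates: 5% FAR and 5% FRR per single-instance decision.
for n, m in [(1, 1), (3, 1), (3, 2)]:
    far, frr = fused_error_rates(0.05, 0.05, n, m)
    print(f"instances={n} samples={m}: FAR={far:.2e} FRR={frr:.2e}")
```

Raising the number of instances drives the false accept rate down at the expense of false rejects; allowing more samples per instance pulls in the other direction, which is the control knob the paper describes.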

Relevance: 20.00%

Abstract:

Availability has become a primary goal of information security and is as significant as other goals, in particular confidentiality and integrity. Maintaining the availability of essential services on the public Internet is an increasingly difficult task in the presence of sophisticated attackers. Attackers may abuse the limited computational resources of a service provider, and thus managing computational costs is a key strategy for achieving the goal of availability. In this thesis we focus on cryptographic approaches for managing computational costs, in particular computational effort. We focus on two cryptographic techniques: computational puzzles in cryptographic protocols and secure outsourcing of cryptographic computations. This thesis contributes to the area of cryptographic protocols in the following ways. First, we propose the most efficient puzzle scheme based on modular exponentiations which, unlike previous schemes of the same type, involves only a few modular multiplications for solution verification; our scheme is provably secure. We then introduce a new efficient gradual authentication protocol by integrating a puzzle into a specific signature scheme. Our software implementation results for the new authentication protocol show that our approach is more efficient and effective than the traditional RSA signature-based one and improves the DoS resilience of the Secure Sockets Layer (SSL) protocol, the most widely used security protocol on the Internet. Our next contributions relate to capturing a specific property that enables the secure outsourcing of partial-decryption tasks. We formally define the property of (non-trivial) public verifiability for general encryption schemes, key encapsulation mechanisms (KEMs), and hybrid encryption schemes, encompassing public-key, identity-based, and tag-based encryption flavours.
We show that some generic transformations and concrete constructions enjoy this property, and then present a new public-key encryption (PKE) scheme having this property together with a proof of security under standard assumptions. Finally, we combine puzzles with PKE schemes to enable delayed decryption in applications such as e-auctions and e-voting. For this we first introduce the notion of effort-release PKE (ER-PKE), encompassing the well-known timed-release encryption and encapsulated key escrow techniques. We then present a security model for ER-PKE and a generic construction of ER-PKE complying with our security notion.
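The combination of modular-exponentiation puzzles with cheap verification and timed-release decryption can be illustrated by the classic Rivest-Shamir-Wagner time-lock puzzle. This is a well-known related construction, not the thesis's scheme, and the primes are toy-sized for the demo:

```python
import secrets

# Toy primes; a real puzzle needs large random primes so n cannot be factored.
p, q = 1000003, 1000033
n, phi = p * q, (p - 1) * (q - 1)

def make_puzzle(t):
    """Server side: issue (a, t); phi stays secret as the trapdoor."""
    a = secrets.randbelow(n - 2) + 2
    return a, t

def solve(a, t):
    """Client side: t sequential modular squarings (the intended work)."""
    x = a % n
    for _ in range(t):
        x = x * x % n
    return x

def verify(a, t, solution):
    """Server side: one short exponentiation via the trapdoor phi(n),
    instead of repeating the t squarings."""
    return solution == pow(a, pow(2, t, phi), n)

a, t = make_puzzle(1000)
assert verify(a, t, solve(a, t))
```

The asymmetry is the point: the solver must perform t sequential squarings, while the issuer verifies with a single reduced exponentiation, which is what makes such puzzles useful for both DoS resistance and delayed decryption.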

Relevance: 20.00%

Abstract:

Denial-of-service (DoS) attacks are a growing concern for networked services like the Internet. In recent years, major Internet e-commerce and government sites have been disabled by various DoS attacks. A common form of DoS attack is a resource depletion attack, in which an attacker tries to overload the server's resources, such as memory or computational power, rendering the server unable to service honest clients. A promising way to deal with this problem is for a defending server to identify and segregate malicious traffic as early as possible. Client puzzles, also known as proofs of work, have been shown to be a promising tool to thwart DoS attacks in network protocols, particularly in authentication protocols. In this thesis, we design efficient client puzzles and propose a stronger security model for analysing client puzzles. We revisit a few key establishment protocols to analyse their DoS-resilience properties and strengthen them using existing and novel techniques. Our contributions in the thesis are manifold. We propose an efficient client puzzle whose security holds in the standard model under new computational assumptions. Assuming the presence of powerful DoS attackers, we find a weakness in the most recent security model proposed for analysing client puzzles, and this study leads us to introduce a better security model. We demonstrate the utility of our new security definitions by presenting two stronger hash-based client puzzles. We also show that, using stronger client puzzles, any protocol can be converted into a provably secure DoS-resilient key exchange protocol. In other contributions, we analyse the DoS-resilience properties of network protocols such as Just Fast Keying (JFK) and Transport Layer Security (TLS). In the JFK protocol, we identify a new DoS attack by applying Meadows' cost-based framework to analyse its DoS-resilience properties. We also prove that the original security claim of JFK does not hold.
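Hash-based client puzzles are commonly built Hashcash-style: the client must find a nonce whose hash with the server's challenge has a prescribed number of leading zero bits. A minimal sketch of that common form (not the thesis's stronger constructions):

```python
import hashlib
import itertools

def leading_zero_bits(digest: bytes) -> int:
    """Count the leading zero bits of a hash digest."""
    bits = bin(int.from_bytes(digest, "big"))[2:].zfill(len(digest) * 8)
    return len(bits) - len(bits.lstrip("0"))

def solve_puzzle(challenge: bytes, difficulty: int) -> int:
    """Client: brute-force a nonce; expected work is about 2**difficulty hashes."""
    for nonce in itertools.count():
        d = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if leading_zero_bits(d) >= difficulty:
            return nonce

def verify_puzzle(challenge: bytes, difficulty: int, nonce: int) -> bool:
    """Server: a single hash suffices to verify, keeping the defender cheap."""
    d = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return leading_zero_bits(d) >= difficulty

challenge = b"server-chosen fresh challenge"
nonce = solve_puzzle(challenge, difficulty=12)   # low difficulty for the demo
assert verify_puzzle(challenge, 12, nonce)
```

The difficulty parameter lets the defender scale the client's work to the current threat level while its own verification cost stays at one hash per request.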
We then apply an existing technique to reduce the server's cost and prove that the new variant of JFK achieves perfect forward secrecy (a property not achieved by the original JFK protocol) and is secure under the original security assumptions of JFK. Finally, we introduce a novel cost-shifting technique which significantly reduces the computational cost of the server, and we employ the technique in TLS, the most important network protocol, to analyse the security of the resulting protocol. We also observe that the cost-shifting technique can be incorporated into any Diffie-Hellman based key exchange protocol to reduce the Diffie-Hellman exponentiation cost of a party by one multiplication and one addition.
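One way to read the closing cost observation (our assumption, not necessarily the thesis's construction) is that a fresh Diffie-Hellman ephemeral value can be derived from a previous one, replacing a full exponentiation with one exponent addition and one group multiplication:

```python
import secrets

# Toy multiplicative group mod a prime. Real deployments use large, properly
# generated parameters or elliptic curves; these values are illustrative.
p = 2 ** 127 - 1      # Mersenne prime modulus (toy choice)
g = 3                 # assumed generator, for illustration only

x = secrets.randbelow(p - 2) + 1
gx = pow(g, x, p)     # one full modular exponentiation, paid once
d = secrets.randbelow(p - 2) + 1
gd = pow(g, d, p)     # precomputed offset, also paid once

# Each subsequent ephemeral public value then costs only one addition on the
# exponent (mod the group order p - 1) and one modular multiplication:
x_next = (x + d) % (p - 1)
gx_next = gx * gd % p
assert gx_next == pow(g, x_next, p)
# Caution: reusing related ephemeral exponents has security implications;
# this sketch only illustrates the cost arithmetic, not a vetted protocol.
```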

Relevance: 20.00%

Abstract:

As a decentralised communication technology, the Internet has offered much autonomy and unprecedented communication freedom to the Chinese public. Yet the Chinese government has imposed different forms of censorship over cyberspace. The Hong Kong erotic photo scandal, however, unsettles the traditional understanding of censorship in China, as it points to a different territory. This paper takes the 2008 Hong Kong erotic photo scandal as a case study and aims to examine the social and generational conflicts hidden in China. When thousands of photos containing sexually explicit images of Hong Kong celebrities were released on the Internet, gossip, controversy and eroticism fuelled the public discussion and threatened traditional values in China. The Internet provides an alternative space for young Chinese who have been excluded from mainstream social discourse to engage in public debates. This, however, creates concern, fear and even anger among the older generations in China, because they can no longer control, monitor and educate their children in the way that their predecessors have done for centuries. The photo scandal illustrates the internal social conflicts and distrust between generations in China, and the generational conflict has far-reaching political ramifications as it creates a new concept of censorship.

Relevance: 20.00%

Abstract:

Focuses on the various aspects of advances in future information communication technology and its applications. Presents the latest issues and progress in the area of future information communication technology. Applicable to both researchers and professionals. These proceedings are based on the 2013 International Conference on Future Information & Communication Engineering (ICFICE 2013), which will be held in Shenyang, China, from June 24-26, 2013. The conference is open to participants from all over the world, and participation from the Asia-Pacific region is particularly encouraged. The focus of this conference is on all technical aspects of electronics, information, and communications. ICFICE-13 will provide an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of FICE. In addition, the conference will publish high-quality papers which are closely related to the various theories and practical applications in FICE. Furthermore, we expect that the conference and its publications will be a trigger for further related research and technology improvements in this important subject. "This work was supported by the NIPA (National IT Industry Promotion Agency) of Korea Grant funded by the Korean Government (Ministry of Science, ICT & Future Planning)."

Relevance: 20.00%

Abstract:

In this study we develop a theorization of an Internet dating site as a cultural artifact. The site, Gaydar, is targeted at gay men. We argue that contemporary received representations of their sexuality figure heavily in the site’s focus by providing a cultural logic for the apparent ad hoc development trajectories of its varied commercial and non-commercial services. More specifically, we suggest that the growing sets of services related to the website are heavily enmeshed within current social practices and meanings. These practices and meanings are, in turn, shaped by the interactions and preferences of a variety of diverse groups involved in what is routinely seen within the mainstream literature as a singularly specific sexuality and cultural project. Thus, we attend to two areas – the influence of the various social engagements associated with Gaydar together with the further extension of its trajectory ‘beyond the web’. Through the case of Gaydar, we contribute a study that recognizes the need for attention to sexuality in information systems research and one which illustrates sexuality as a pivotal aspect of culture. We also draw from anthropology to theorize ICTs as cultural artifacts and provide insights into the contemporary phenomena of ICT enabled social networking.

Relevance: 20.00%

Abstract:

Most security models for authenticated key exchange (AKE) do not explicitly model the associated certification system, which includes the certification authority (CA) and its behaviour. However, there are several well-known and realistic attacks on AKE protocols which exploit various forms of malicious key registration and which therefore lie outside the scope of these models. We provide the first systematic analysis of AKE security incorporating certification systems (ASICS). We define a family of security models that, in addition to allowing different sets of standard AKE adversary queries, also permit the adversary to register arbitrary bitstrings as keys. For this model family we prove generic results that enable the design and verification of protocols that achieve security even if some keys have been produced maliciously. Our approach is applicable to a wide range of models and protocols; as a concrete illustration of its power, we apply it to the CMQV protocol in the natural strengthening of the eCK model to the ASICS setting.

Relevance: 20.00%

Abstract:

Dáwat, Pamahándí, Tawíd, Ságda, Lampísa, Ibabások, Lapát, Panedlák: for most of us gathered here, these are words that we do not usually use in our daily lives. Others may consider them exotic, alien, funny or even backward. However, for the indigenous kindred among us, these words denote an intimate identity and a deep understanding of the world around them. They constitute a broader knowledge system, whether written or otherwise, which guides them in the management of resources within their ancestral land. This paper provides a brief theoretical framework for the concepts of indigenous knowledge systems (hereinafter called IKS) and indigenous peoples' food security, and will hopefully foster a deeper or continued appreciation of the study of both concepts in general.

Relevance: 20.00%

Abstract:

Advances in Information and Communication Technologies have the potential to improve many facets of modern healthcare service delivery. The implementation of electronic health records systems is a critical part of an eHealth system. Despite the potential gains, there are several obstacles that limit the wider development of electronic health record systems. Among these are the perceived threats to the security and privacy of patients’ health data, and a widely held belief that these cannot be adequately addressed. We hypothesise that the major concerns regarding eHealth security and privacy cannot be overcome through the implementation of technology alone. Human dimensions must be considered when analysing the provision of the three fundamental information security goals: confidentiality, integrity and availability. A sociotechnical analysis to establish the information security and privacy requirements when designing and developing a given eHealth system is important and timely. A framework that accommodates consideration of the legislative requirements and human perspectives, in addition to the technological measures, is useful in developing a measurable and accountable eHealth system. Successful implementation of this approach would enable the possibilities, practicalities and sustainability of proposed eHealth systems to be realised.

Relevance: 20.00%

Abstract:

This paper presents a comprehensive formal security framework for key derivation functions (KDFs). The major security goal for a KDF is to produce cryptographic keys from a private seed value such that the derived keys are indistinguishable from random binary strings. We form a framework of five security models for KDFs. This consists of four security models that we propose: Known Public Inputs Attack (KPM, KPS), Adaptive Chosen Context Information Attack (CCM) and Adaptive Chosen Public Inputs Attack (CPM); and another security model, previously defined by Krawczyk [6], which we refer to as Adaptive Chosen Context Information Attack (CCS). These security models are simulated using an indistinguishability game. In addition, we prove the relationships between these five security models and analyse KDFs using the framework (in the random oracle model).
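The shape of such an indistinguishability game can be sketched concretely. The KDF below is a simplified HKDF-style HMAC construction and the game is illustrative only; neither is a scheme or model from the paper:

```python
import hashlib
import hmac
import secrets

def kdf(seed: bytes, context: bytes, length: int = 32) -> bytes:
    """Simplified HKDF-style extract-then-expand KDF (illustrative only)."""
    prk = hmac.new(b"salt", seed, hashlib.sha256).digest()            # extract
    okm = hmac.new(prk, context + b"\x01", hashlib.sha256).digest()   # expand
    return okm[:length]

def indistinguishability_game(adversary) -> bool:
    """Challenger flips a hidden bit b and hands the adversary either real
    KDF output (b = 1) or uniformly random bytes (b = 0). Returns True if
    the adversary's guess equals b."""
    seed = secrets.token_bytes(32)          # private seed, hidden from adversary
    context = b"public context info"        # public input, given to adversary
    b = secrets.randbits(1)
    challenge = kdf(seed, context) if b else secrets.token_bytes(32)
    return adversary(context, challenge) == b

# An adversary with no strategy wins with probability about 1/2 (no advantage).
wins = sum(indistinguishability_game(lambda ctx, ch: 0) for _ in range(1000))
print(f"blind-guess win rate: {wins / 1000:.2f}")
```

A KDF is secure in such a model when no efficient adversary wins with probability non-negligibly better than 1/2; the paper's models differ in which inputs the adversary may know or choose.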

Relevance: 20.00%

Abstract:

Information accountability is seen as a mode of usage control on the Web. Due to its many dimensions, information accountability has been expressed in various ways by computer scientists to address security and privacy in recent times. Information accountability is focused on how users participate in a system and the underlying policies that govern the participation. Healthcare is a domain in which the principles of information accountability can be utilised well. Modern health information systems are Internet based and the discipline is called eHealth. In this paper, we identify and discuss the goals of accountability systems and present the principles of information accountability. We characterise those principles in eHealth and discuss them contextually. We identify the current impediments to eHealth in terms of information privacy issues of eHealth consumers together with information usage requirements of healthcare providers and show how information accountability can be used in a healthcare context to address these needs. The challenges of implementing information accountability in eHealth are also discussed in terms of our efforts thus far.

Relevance: 20.00%

Abstract:

Reliability of the performance of biometric identity verification systems remains a significant challenge. Individual biometric samples of the same person (identity class) are not identical at each presentation, and performance degradation arises from intra-class variability and inter-class similarity. These limitations lead to false accepts and false rejects that are dependent, so it is difficult to reduce the rate of one type of error without increasing the other. The focus of this dissertation is to investigate a method based on classifier fusion techniques to better control the trade-off between the verification errors, using text-dependent speaker verification as the test platform. A sequential classifier fusion architecture that integrates multi-instance and multi-sample fusion schemes is proposed. This fusion method enables a controlled trade-off between false alarms and false rejects. For statistically independent classifier decisions, analytical expressions for each type of verification error are derived from the base classifier performances. As this assumption may not always be valid, these expressions are modified to incorporate the correlation between statistically dependent decisions from clients and impostors. The architecture is empirically evaluated for text-dependent speaker verification using Hidden Markov Model based digit-dependent speaker models in each stage, with multiple attempts for each digit utterance. The trade-off between the verification errors is controlled using two parameters, the number of decision stages (instances) and the number of attempts at each decision stage (samples), fine-tuned on an evaluation/tuning set. The statistical validation of the derived expressions for the error estimates is carried out on test data. The performance of the sequential method is further shown to depend on the order of the combination of digits (instances) and the nature of the repetitive attempts (samples).
The false rejection and false acceptance rates for the proposed fusion are estimated using the base classifier performances, the variance in correlation between classifier decisions, and the sequence of classifiers with favourable dependence selected using the 'Sequential Error Ratio' criterion. The error rates are better estimated by incorporating user-dependent information (such as speaker-dependent thresholds and speaker-specific digit combinations) and class-dependent information (such as client-impostor dependent favourable combinations and class-error based threshold estimation). The proposed architecture is desirable in most speaker verification applications, such as remote authentication and telephone and Internet shopping. The tuning of the parameters, the number of instances and samples, serves both the security and user-convenience requirements of speaker-specific verification. The architecture investigated here is applicable to verification using other biometric modalities such as handwriting, fingerprints and keystrokes.
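The sequential stage/attempt error behaviour can be estimated by Monte Carlo simulation. This sketch assumes statistically independent decisions and hypothetical base error rates; it is an illustration of the mechanism, not the dissertation's evaluation:

```python
import random

def simulate(frr, far, n_instances, n_samples, trials=20000, seed=1):
    """Monte Carlo estimate of fused errors for independent decisions:
    a claimant must pass every stage (instance), with up to n_samples
    attempts per stage; a stage passes if any attempt is accepted."""
    rng = random.Random(seed)

    def stage_passes(p_accept):
        return any(rng.random() < p_accept for _ in range(n_samples))

    genuine_rejects = sum(
        not all(stage_passes(1 - frr) for _ in range(n_instances))
        for _ in range(trials))
    impostor_accepts = sum(
        all(stage_passes(far) for _ in range(n_instances))
        for _ in range(trials))
    return genuine_rejects / trials, impostor_accepts / trials

# Hypothetical 5% base error rates; 3 stages with 2 attempts each.
frr_hat, far_hat = simulate(frr=0.05, far=0.05, n_instances=3, n_samples=2)
print(f"estimated fused FRR={frr_hat:.4f}, FAR={far_hat:.4f}")
```

Varying `n_instances` and `n_samples` in the call reproduces the controlled trade-off the dissertation describes: more stages suppress impostor accepts, more attempts per stage suppress genuine rejects.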

Relevance: 20.00%

Abstract:

Stream ciphers are common cryptographic algorithms used to protect the confidentiality of frame-based communications such as mobile phone conversations and Internet traffic. Stream ciphers are ideal cryptographic algorithms for encrypting these types of traffic, as they have the potential to encrypt them quickly and securely, and they have low error propagation. The main objective of this thesis is to determine whether structural features of keystream generators affect the security provided by stream ciphers. These structural features pertain to the state-update and output functions used in keystream generators. Using linear sequences as keystream to encrypt messages is known to be insecure, so modern keystream generators use nonlinear sequences as keystream. The nonlinearity can be introduced through a keystream generator's state-update function, output function, or both. The first contribution of this thesis relates to nonlinear sequences produced by the well-known Trivium stream cipher. Trivium is one of the stream ciphers selected in the final portfolio of the multi-year European eSTREAM project, run by the ECRYPT network. Trivium's structural simplicity makes it a popular cipher to cryptanalyse, but to date there are no attacks in the public literature which are faster than exhaustive keysearch. Algebraic analyses are performed on the Trivium stream cipher, which uses a nonlinear state-update function and a linear output function to produce keystream. Two algebraic investigations are performed: an examination of the sliding property in the initialisation process, and algebraic analyses of Trivium-like stream ciphers using a combination of the algebraic techniques previously applied separately by Berbain et al. and Raddum. For certain iterations of Trivium's state-update function, we examine the sets of slid pairs, looking particularly to form chains of slid pairs. No chains exist for a small number of iterations. This has implications for the period of keystreams produced by Trivium.
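Trivium's structure is compact enough to sketch directly. The following bit-level Python follows the public eSTREAM specification (unoptimised and for illustration only; a deployed implementation would work on words, not bit lists):

```python
def trivium(key_bits, iv_bits, n_bits):
    """Trivium keystream generator, bit-level, per the eSTREAM specification.
    key_bits and iv_bits are lists of 80 bits (0/1) each."""
    assert len(key_bits) == len(iv_bits) == 80
    # Load the 288-bit state: key | zeros | IV | zeros | three fixed 1 bits.
    s = key_bits + [0] * 13 + iv_bits + [0] * 112 + [1, 1, 1]
    assert len(s) == 288

    def clock():
        t1 = s[65] ^ s[92]
        t2 = s[161] ^ s[176]
        t3 = s[242] ^ s[287]
        z = t1 ^ t2 ^ t3                   # linear output function
        t1 ^= (s[90] & s[91]) ^ s[170]     # nonlinear state update
        t2 ^= (s[174] & s[175]) ^ s[263]
        t3 ^= (s[285] & s[286]) ^ s[68]
        s[93:177] = [t1] + s[93:176]       # shift the three registers
        s[177:288] = [t2] + s[177:287]
        s[0:93] = [t3] + s[0:92]
        return z

    for _ in range(4 * 288):               # initialisation: output discarded
        clock()
    return [clock() for _ in range(n_bits)]

ks = trivium([0] * 80, [0] * 80, 64)
print("first keystream bits:", "".join(map(str, ks[:32])))
```

The code makes the division of labour visible: the AND terms in the state update are the cipher's only source of nonlinearity, while the output is a plain XOR of six state bits, which is exactly the structure the algebraic analyses above exploit.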
Secondly, using our combination of the methods of Berbain et al. and Raddum, we analyse Trivium-like ciphers and improve on previous analysis with regard to forming systems of equations for these ciphers. Using these new systems of equations, we were able to successfully recover the initial state of Bivium-A. The attack complexities for Bivium-B and Trivium were, however, worse than exhaustive keysearch. We also show that the selection of stages used as input to the output function and the size of the registers used in the construction of the system of equations affect the success of the attack. The second contribution of this thesis is the examination of state convergence. State convergence is an undesirable characteristic in keystream generators for stream ciphers, as it implies that the effective session key size of the stream cipher is smaller than the designers intended. We identify methods which can be used to detect state convergence. As a case study, the Mixer stream cipher, which uses nonlinear state-update and output functions to produce keystream, is analysed. Mixer is found to suffer from state convergence because the state-update function used in its initialisation process is not one-to-one. A discussion of several other stream ciphers which are known to suffer from state convergence is given. From our analysis of these stream ciphers, three mechanisms which can cause state convergence are identified. The effect state convergence can have on stream cipher cryptanalysis is examined. We show that state convergence can have a positive effect if the goal of the attacker is to recover the initial state of the keystream generator. The third contribution of this thesis is the examination of the distributions of bit patterns in the sequences produced by nonlinear filter generators (NLFGs) and linearly filtered nonlinear feedback shift registers.
We show that the selection of stages used as input to a keystream generator's output function can affect the distribution of bit patterns in the sequences produced by these keystream generators, and that the effect differs for nonlinear filter generators and linearly filtered nonlinear feedback shift registers. In the case of NLFGs, the keystream sequences produced when the output function takes inputs from consecutive register stages are less uniform than sequences produced by NLFGs whose output functions take inputs from unevenly spaced register stages. The opposite is true for keystream sequences produced by linearly filtered nonlinear feedback shift registers.
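The kind of experiment behind such a comparison can be sketched with a toy nonlinear filter generator. All parameters here are our own illustrative choices (the feedback taps are assumed, not verified primitive, and the filter function f(a, b, c) = a XOR (b AND c) is hypothetical); the sketch only shows how one would tabulate bit-pattern distributions for consecutive versus spread filter taps:

```python
from collections import Counter

def nlfg(filter_taps, n, reg_len=8, fb_taps=(0, 2, 3, 4)):
    """Toy nonlinear filter generator: a Fibonacci LFSR whose output is the
    nonlinear function f(a, b, c) = a ^ (b & c) of three filter stages.
    fb_taps is an assumed feedback tap set, not a verified primitive choice."""
    state = [1] + [0] * (reg_len - 1)      # fixed nonzero initial state
    out = []
    for _ in range(n):
        a, b, c = (state[t] for t in filter_taps)
        out.append(a ^ (b & c))            # nonlinear filter output
        fb = state[fb_taps[0]] ^ state[fb_taps[1]] \
            ^ state[fb_taps[2]] ^ state[fb_taps[3]]
        state = state[1:] + [fb]           # shift the register
    return out

def pattern_counts(bits, width=3):
    """Frequencies of overlapping bit patterns of the given width."""
    return Counter(tuple(bits[i:i + width]) for i in range(len(bits) - width + 1))

n = 4096
print("consecutive filter taps:", pattern_counts(nlfg((0, 1, 2), n)))
print("spread filter taps:     ", pattern_counts(nlfg((0, 3, 6), n)))
```

Comparing the two printed tables against the uniform expectation (each 3-bit pattern near n/8) is the flavour of distribution analysis the paragraph describes, applied in the thesis to properly chosen NLFGs and linearly filtered NLFSRs.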