879 results for Information security culture


Relevance: 80.00%

Abstract:

The QUT-NOISE-SRE protocol is designed to mix the large QUT-NOISE database, consisting of over 10 hours of background noise, collected across 10 unique locations covering 5 common noise scenarios, with commonly used speaker recognition datasets such as Switchboard, Mixer and the speaker recognition evaluation (SRE) datasets provided by NIST. By allowing common, clean, speech corpora to be mixed with a wide variety of noise conditions, environmental reverberant responses, and signal-to-noise ratios, this protocol provides a solid basis for the development, evaluation and benchmarking of robust speaker recognition algorithms, and is freely available to download alongside the QUT-NOISE database. In this work, we use the QUT-NOISE-SRE protocol to evaluate a state-of-the-art PLDA i-vector speaker recognition system, demonstrating the importance of designing voice-activity-detection front-ends specifically for speaker recognition, rather than aiming for perfect coherence with the true speech/non-speech boundaries.
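A minimal sketch of the core operation such a protocol relies on: scaling a noise recording so that, when added to clean speech, the mixture hits a target signal-to-noise ratio. This is not the QUT-NOISE-SRE tooling itself; the NumPy-based function and the synthetic signals below are illustrative assumptions.

```python
# Minimal sketch of mixing clean speech with noise at a target SNR,
# illustrating the general idea behind noise-augmentation protocols.
# Not the QUT-NOISE-SRE tools; NumPy array mixing is an assumption.
import numpy as np

def mix_at_snr(speech: np.ndarray, noise: np.ndarray, snr_db: float) -> np.ndarray:
    """Scale `noise` so the speech-to-noise power ratio equals `snr_db`, then add."""
    # Loop or trim the noise to the speech length.
    if len(noise) < len(speech):
        noise = np.tile(noise, int(np.ceil(len(speech) / len(noise))))
    noise = noise[:len(speech)]

    speech_power = np.mean(speech.astype(np.float64) ** 2)
    noise_power = np.mean(noise.astype(np.float64) ** 2)

    # Required noise gain g from SNR_dB = 10*log10(P_s / (g^2 * P_n)).
    gain = np.sqrt(speech_power / (noise_power * 10.0 ** (snr_db / 10.0)))
    return speech + gain * noise

# Example: mix at 5 dB SNR using synthetic stand-in signals.
rng = np.random.default_rng(0)
speech = rng.standard_normal(16000)   # stand-in for 1 s of 16 kHz speech
noise = rng.standard_normal(48000)    # stand-in for background noise
noisy = mix_at_snr(speech, noise, snr_db=5.0)
```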

Relevance: 80.00%

Abstract:

Real-world cryptographic protocols such as the widely used Transport Layer Security (TLS) protocol support many different combinations of cryptographic algorithms (called ciphersuites) and simultaneously support different versions. Recent advances in provable security have shown that most modern TLS ciphersuites are secure authenticated and confidential channel establishment (ACCE) protocols, but these analyses generally focus on single ciphersuites in isolation. In this paper we extend the ACCE model to cover protocols with many different sub-protocols, capturing both multiple ciphersuites and multiple versions, and define a security notion for secure negotiation of the optimal sub-protocol. We give a generic theorem that shows how secure negotiation follows, with some additional conditions, from the authentication property of secure ACCE protocols. Using this framework, we analyse the security of ciphersuite negotiation and of three variants of version negotiation in TLS, including a recently proposed mechanism for detecting fallback attacks.
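A toy sketch of what "secure negotiation of the optimal sub-protocol" means operationally: the server picks the best mutually supported version and flags a suspicious fallback, in the spirit of TLS downgrade protection. This is not the paper's formal ACCE-based model; the version names and the fallback signal below are illustrative assumptions.

```python
# Toy illustration of sub-protocol (version) negotiation with downgrade
# detection. Not the paper's formal model; names are illustrative only.

SERVER_SUPPORTED = ["TLS1.0", "TLS1.1", "TLS1.2"]  # ordered worst -> best

def negotiate(client_offered, client_signals_fallback):
    """Pick the best mutually supported version; reject suspicious fallbacks."""
    common = [v for v in SERVER_SUPPORTED if v in client_offered]
    if not common:
        raise ValueError("no common version")
    chosen = common[-1]  # optimal (highest) mutually supported version

    # If the client signals it is retrying after abandoning a higher version
    # (a fallback), yet the server supports something higher than what is
    # being negotiated, treat this as an inappropriate downgrade.
    if client_signals_fallback and chosen != SERVER_SUPPORTED[-1]:
        raise ValueError("inappropriate fallback detected")
    return chosen

negotiate(["TLS1.1", "TLS1.2"], client_signals_fallback=False)   # -> "TLS1.2"
# negotiate(["TLS1.0", "TLS1.1"], client_signals_fallback=True)  # would raise
```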

Relevance: 80.00%

Abstract:

The concept of big data has already surpassed traditional data management efforts in almost all industries. In other instances it has obtained promising results that derive value from large-scale integration and analysis of heterogeneous data sources, for example genomic and proteomic information. Big data analytics, which describes the data sets and analytical techniques applied to software applications that are too large and complex for traditional tools, has become increasingly important due to its significant advantages, including better business decisions, cost reduction and the delivery of new products and services [1]. In a similar context, the health community has experienced not only larger and more complex data content, but also information systems that contain a large number of data sources with interrelated and interconnected data attributes. This has resulted in challenging and highly dynamic environments, leading to the creation of big data with its innumerable complexities, for instance sharing information while meeting the security requirements expected by stakeholders. Compared with other sectors, the health sector is still in the early stages of big data analysis. Key challenges include accommodating the volume, velocity and variety of healthcare data in the current deluge of exponential growth. Given the complexity of big data, it is understood that while data storage and accessibility are technically manageable, the application of Information Accountability measures to healthcare big data might be a practical solution in support of information security, privacy and traceability. Transparency is one important measure that can demonstrate integrity, a vital factor in healthcare services. Clarity about performance expectations is another Information Accountability measure, necessary to avoid data ambiguity, controversy about interpretation and, finally, liability [2]. According to current studies, Electronic Health Records (EHRs) are key information resources for big data analysis and are also composed of varied co-created values [3]. Common healthcare information originates from, and is used by, different actors and groups, which facilitates understanding of its relationship to other data sources. Consequently, healthcare services often operate as an integrated service bundle. Although it is a critical requirement in healthcare services and analytics, it is difficult to find a comprehensive set of guidelines for adopting EHRs to fulfil big data analysis requirements. Therefore, as a remedy, this research focuses on a systematic approach containing comprehensive guidelines, with the accurate data that must be provided, to apply and evaluate big data analysis until the necessary decision-making requirements are fulfilled to improve the quality of healthcare services. Hence, we believe that this approach would subsequently improve quality of life.

Relevance: 80.00%

Abstract:

Unified communications as a service (UCaaS) can be regarded as a cost-effective model for on-demand delivery of unified communications services in the cloud. However, addressing security concerns has been seen as the biggest challenge to the adoption of IT services in the cloud. This study set up a cloud system via the VMware suite to emulate, in a laboratory environment, hosting unified communications (UC) services, the integration of two or more real-time communication systems, in the cloud. An Internet Protocol Security (IPSec) gateway was also set up to support network-level security for UCaaS against possible security exposures. The study aimed to analyse an implementation of UCaaS over IPSec and to evaluate the latency of encrypted UC traffic while that traffic is being protected. Our test results show no added latency when IPSec is implemented with the G.711 audio codec. However, the performance of the G.722 audio codec with an IPSec implementation affects the overall performance of the UC server. These results give technical advice and guidance to those involved in UC security controls on premises as well as in the cloud.
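A back-of-envelope sketch of why the encryption overhead examined here matters: per-packet IPSec ESP framing inflates the on-the-wire bandwidth of a 64 kbit/s voice stream noticeably. The header and cipher sizes below (AES-CBC with HMAC-SHA1-96 in tunnel mode) are assumptions for illustration, not figures from the study.

```python
# Rough per-call bandwidth estimate for RTP voice over IPsec ESP (tunnel mode).
# Field sizes assume AES-CBC + HMAC-SHA1-96; they are not measured values.

def esp_tunnel_bandwidth_kbps(payload_bytes: int, packets_per_sec: int) -> float:
    rtp, udp, inner_ip = 12, 8, 20               # per-packet RTP/UDP/IP headers
    esp_hdr, iv, icv, outer_ip = 8, 16, 12, 20   # assumed ESP fields + new IP header

    # ESP trailer: pad to the 16-byte AES-CBC block, plus 2 bytes pad-len/next-header.
    plaintext = payload_bytes + rtp + udp + inner_ip + 2
    padded = ((plaintext + 15) // 16) * 16

    wire_bytes = outer_ip + esp_hdr + iv + padded + icv
    return wire_bytes * 8 * packets_per_sec / 1000.0

# G.711 and G.722 both carry 64 kbit/s of audio: 160-byte payloads at 50 pps (20 ms frames).
print(esp_tunnel_bandwidth_kbps(160, 50))   # about 105 kbit/s on the wire
```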

Relevance: 80.00%

Abstract:

Many software applications extend their functionality by dynamically loading libraries into their allocated address space. However, shared libraries are also often of unknown provenance and quality and may contain accidental bugs or, in some cases, deliberately malicious code. Most sandboxing techniques which address these issues require recompilation of the libraries using custom tool chains, require significant modifications to the libraries, do not retain the benefits of single address-space programming, do not completely isolate guest code, or incur substantial performance overheads. In this paper we present LibVM, a sandboxing architecture for isolating libraries within a host application without requiring any modifications to the shared libraries themselves, while still retaining the benefits of a single address space and also introducing a system call interposition layer that allows complete arbitration over a shared library’s functionality. We show how to utilize contemporary hardware virtualization support towards this end with reasonable performance overheads; in the absence of such hardware support, our model can also be implemented using a software-based mechanism. We ensure that our implementation conforms as closely as possible to existing shared library manipulation functions, minimizing the amount of effort needed to apply such isolation to existing programs. Our experimental results show that it is easy to gain immediate benefits in scenarios where the goal is to guard the host application against unintentional programming errors when using shared libraries, as well as in more complex scenarios, where a shared library is suspected of being actively hostile. In both cases, no changes are required to the shared libraries themselves.
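For contrast, a much simpler isolation idea than LibVM is sketched below: running an untrusted shared-library call in a separate, restricted process so that a crash or memory corruption inside the library cannot take down the host. Unlike LibVM this gives up the single-address-space benefit; the library path and symbol name are hypothetical.

```python
# Simplistic, process-based alternative to the single-address-space isolation
# LibVM provides: load and call the untrusted library only in a child process.
# "./libuntrusted.so" and "do_work" are hypothetical placeholders.
import ctypes
import multiprocessing as mp

def _call_library(lib_path, func_name, arg, result_queue):
    """Runs inside a sacrificial child process."""
    lib = ctypes.CDLL(lib_path)          # the untrusted library is loaded here only
    func = getattr(lib, func_name)
    func.restype = ctypes.c_int
    result_queue.put(func(ctypes.c_int(arg)))

def isolated_call(lib_path, func_name, arg, timeout=5.0):
    result_queue = mp.Queue()
    child = mp.Process(target=_call_library,
                       args=(lib_path, func_name, arg, result_queue))
    child.start()
    child.join(timeout)
    if child.is_alive():
        child.terminate()
        raise RuntimeError("untrusted library call timed out")
    if child.exitcode != 0:
        raise RuntimeError(f"untrusted library crashed (exit code {child.exitcode})")
    return result_queue.get(timeout=1)

# isolated_call("./libuntrusted.so", "do_work", 42)   # hypothetical library/symbol
```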

Relevance: 80.00%

Abstract:

In this note the authors examine two cases, one from Australia, the other from New Zealand, both of which explore the responsibility of legal practitioners engaged as professionals in the buying and selling of land. What is shown is that the allocation of risk and responsibility is constantly under scrutiny for those involved in the conveyancing process, something which the nascent Australian electronic conveyancing protocols will only heighten.

Relevance: 80.00%

Abstract:

Digital signatures are often used by trusted authorities to make unique bindings between a subject and a digital object; for example, certificate authorities certify a public key belongs to a domain name, and time-stamping authorities certify that a certain piece of information existed at a certain time. Traditional digital signature schemes however impose no uniqueness conditions, so a trusted authority could make multiple certifications for the same subject but different objects, be it intentionally, by accident, or following a (legal or illegal) coercion. We propose the notion of a double-authentication-preventing signature, in which a value to be signed is split into two parts: a subject and a message. If a signer ever signs two different messages for the same subject, enough information is revealed to allow anyone to compute valid signatures on behalf of the signer. This double-signature forgeability property discourages signers from misbehaving—a form of self-enforcement—and would give binding authorities like CAs some cryptographic arguments to resist legal coercion. We give a generic construction using a new type of trapdoor functions with extractability properties, which we show can be instantiated using the group of sign-agnostic quadratic residues modulo a Blum integer; we show an additional application of these new extractable trapdoor functions to standard digital signatures.
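A standard analogy for this self-enforcement property (not the paper's construction): if each signature on a (subject, message) pair is a point on a subject-specific line whose slope is the signer's long-term secret, then two different messages for the same subject pin down the line and leak that secret. The toy scheme below is an illustrative assumption and not a real signature scheme; it only mimics the extraction algebra.

```python
# Toy illustration of the self-enforcement idea behind
# double-authentication-preventing signatures. NOT the paper's construction
# (and not a real signature scheme); it only shows how two "signatures" on
# the same subject with different messages leak the long-term secret.
import hashlib

q = 2 ** 127 - 1  # a Mersenne prime; all arithmetic is mod q

def H(*parts) -> int:
    data = b"|".join(str(p).encode() for p in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def sign(secret: int, subject: str, message: str):
    a = H("per-subject intercept", secret, subject)   # fixed per subject
    e = H(subject, message)
    s = (a + e * secret) % q                          # point (e, s) on the line s = a + e*x
    return e, s

secret = H("long-term signing key")

sig1 = sign(secret, "example.org", "public key A")    # honest: one message per subject
sig2 = sign(secret, "example.org", "public key B")    # misbehaviour: a second message

# Anyone holding both points can solve the two linear equations for the slope.
(e1, s1), (e2, s2) = sig1, sig2
extracted = ((s1 - s2) * pow((e1 - e2) % q, -1, q)) % q
assert extracted == secret
```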

Relevance: 80.00%

Abstract:

The Distributed Network Protocol v3.0 (DNP3) is one of the most widely used protocols to control national infrastructure. The move from point-to-point serial connections to Ethernet-based network architectures has allowed for large and complex critical infrastructure networks. However, networks and configurations change, so auditing tools are needed to aid in critical infrastructure network discovery. In this paper we present a series of intrusive techniques used for reconnaissance on DNP3 critical infrastructure. Our algorithms discover DNP3 outstation slaves along with their DNP3 addresses, their corresponding master, and class object configurations. To validate our presented DNP3 reconnaissance algorithms and demonstrate their practicality, we present an implementation of a software tool using a DNP3 plug-in for Scapy. Our implementation validates the utility of our DNP3 reconnaissance technique. The presented techniques will be useful for penetration testing, vulnerability assessments and DNP3 network discovery.
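A hedged sketch of the kind of outstation-address sweep described here: send a DNP3 link-layer Request Link Status frame to each candidate destination address over the default DNP3/TCP port and note who replies. This is not the authors' tool or their Scapy plug-in; the frame layout and the CRC-16/DNP parameters below are the commonly documented values and should be verified against a reference implementation before relying on them.

```python
# Sketch of a DNP3 outstation-address sweep (illustrative, not the paper's tool).
# CRC-16/DNP parameters (poly 0x3D65 reflected, inverted output) are assumed.
import socket

def crc_dnp(data: bytes) -> int:
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA6BC if crc & 1 else crc >> 1
    return crc ^ 0xFFFF

def link_status_request(dst: int, src: int) -> bytes:
    # 0x05 0x64 start, length 5 (ctrl + dst + src), control 0xC9 = DIR | PRM | FC 9.
    header = bytes([0x05, 0x64, 0x05, 0xC9,
                    dst & 0xFF, dst >> 8, src & 0xFF, src >> 8])
    crc = crc_dnp(header)
    return header + bytes([crc & 0xFF, crc >> 8])

def sweep(host: str, addresses, src: int = 1, port: int = 20000):
    """Yield destination addresses that answer on the default DNP3/TCP port."""
    for dst in addresses:
        with socket.create_connection((host, port), timeout=2) as s:
            s.settimeout(2)
            s.sendall(link_status_request(dst, src))
            try:
                if s.recv(32):   # any link-layer reply suggests a live outstation
                    yield dst
            except socket.timeout:
                pass

# for addr in sweep("192.0.2.10", range(0, 100)): print("outstation at", addr)
```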

Relevance: 80.00%

Abstract:

Certain software products employing digital techniques for encryption of data are subject to export controls in the EU Member States pursuant to Community law and relevant laws in the Member States. These controls are agreed globally in the framework of the so-called Wassenaar Arrangement. Wassenaar is an informal non-proliferation regime aimed at promoting international stability and responsibility in transfers of strategic (dual-use) products and technology. This thesis covers provisions of Wassenaar, Community export control laws and export control laws of Finland, Sweden, Germany, France and the United Kingdom. The thesis consists of five chapters. The first chapter discusses the rationale of export control laws and the impact they have on global trade. The rationale is originally defence-related - in general to prevent potential adversaries of participating States from having the same tools, and in particular in the case of cryptographic software to enable signals intelligence efforts. Increasingly, as the use of cryptography in a civilian context has mushroomed, export restrictions can have negative effects on civilian trade. Information security solutions may also be too weak because of export restrictions on cryptography. The second chapter covers the OECD's Cryptography Policy, which had a significant effect on its member nations' national cryptography policies and legislation. The OECD is a significant organization, because it acts as a meeting forum for the most important industrialized nations. The third chapter covers the Wassenaar Arrangement. The Arrangement is covered from the viewpoint of international law and politics. The Wassenaar control list provisions affecting cryptographic software transfers are also covered in detail. Control lists in the EU and in Member States are usually directly copied from Wassenaar control lists. Controls agreed in its framework set only a minimum level for participating States; however, Wassenaar countries can adopt stricter controls. The fourth chapter covers Community export control law. Export controls are viewed in Community law as falling within the domain of Common Commercial Policy pursuant to Article 133 of the EC Treaty. Therefore the Community has exclusive competence in export matters, save where a national measure is authorized by the Community or falls under foreign or security policy derogations established in Community law. The Member States still have a considerable amount of power in the domain of Common Foreign and Security Policy. They are able to maintain national export controls because export control laws are not fully harmonized. This can also have detrimental effects on the functioning of the internal market and common export policies. In 1995 the EU adopted Dual-Use Regulation 3381/94/EC, which sets common rules for exports in Member States. Provisions of this regulation receive detailed coverage in this chapter. The fifth chapter covers national legislation and export authorization practices in five different Member States - Finland, Sweden, Germany, France and the United Kingdom. Export control laws of those Member States are covered where the national laws differ from the uniform approach of the Community's acquis communautaire. Keywords: export control, encryption, software, dual-use, license, foreign trade, e-commerce, Internet

Relevance: 80.00%

Abstract:

Title insurance companies originating from America have, in the past 15 years, become part of the Australian conveyancing landscape. However, for most residential freehold owners, their activities would be a mystery. A purchaser does not routinely obtain title insurance, with the companies presently focussing on servicing the mortgagee sector. While the lack of penetration in the residential purchaser market may be attributed to the consumer’s lack of knowledge, evidence from Ontario and New Zealand illustrates that title insurance is likely to become an additional cost in the conveyancing process in Australia. In this article we highlight the reasons why, and demonstrate how title insurers have, by working with the legal profession, been able to subtly move the risk of responsibility for compensation for loss (at least in the first instance) from the state to the insurer, with the added benefit for the state and the conveyancing agents that the cost of the insurance is ultimately borne by the consumer. In New Zealand this development is being accelerated by the introduction of capped conveyancing title insurance. Whether title insurance will become part of the conveyancing process is no longer the relevant question for Australia (it undoubtedly will); the unknown issue is just how title insurance companies will work with conveyancing agents to infiltrate the market, and what response this infiltration will have in terms of the state’s view of its continued role in the provision of assurance. We suggest that developments from New Zealand in relation to capped conveyancing insurance are likely to be replicated in Australia in the near future, and that the state’s role in providing an assurance fund will continue, though the state may seek to expand the areas in which the right to compensation is restricted.

Relevance: 80.00%

Abstract:

The minimum distance of linear block codes is one of the important parameters that indicate the error performance of the code. When the code rate is less than 1/2, efficient algorithms are available for finding the minimum distance using the concept of information sets. When the code rate is greater than 1/2, only one information set is available and efficiency suffers. In this paper, we investigate and propose a novel algorithm to find the minimum distance of linear block codes with a code rate greater than 1/2. We propose to reverse the roles of the information set and the parity set to obtain, in effect, another information set and improve the efficiency. This method is 67.7 times faster than the minimum distance algorithm implemented in the MAGMA Computational Algebra System for an (80, 45) linear block code.
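For comparison with such information-set methods, a brute-force reference is easy to state: enumerate every nonzero codeword from the generator matrix and take the minimum Hamming weight. The sketch below does exactly that and is only feasible for toy parameters, which is precisely why faster algorithms matter for codes like the (80, 45) example; the (7, 4) Hamming generator matrix is used purely as an illustration.

```python
# Brute-force minimum distance of a small binary linear block code, usable as
# a reference when validating faster information-set based methods. It
# enumerates all 2^k - 1 nonzero codewords, so it only works for toy codes.
import itertools

def min_distance(G):
    """G: k x n binary generator matrix given as a list of row lists."""
    k, n = len(G), len(G[0])
    best = n
    for msg in itertools.product([0, 1], repeat=k):
        if not any(msg):
            continue
        codeword = [sum(msg[i] * G[i][j] for i in range(k)) % 2 for j in range(n)]
        best = min(best, sum(codeword))
    return best

# (7, 4) Hamming code: minimum distance 3.
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
print(min_distance(G))  # -> 3
```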

Relevance: 80.00%

Abstract:

Hybrid wireless networks are extensively used in superstores, marketplaces, malls, etc., and providing high QoS (Quality of Service) to end-users has become a challenging task. In this paper, we propose a policy-based, transaction-aware QoS management architecture in a hybrid wireless superstore environment. The proposed scheme operates at the transaction level for downlink QoS management. We derive a policy for the estimation of QoS parameters, such as delay, jitter, bandwidth, availability and packet loss, for every transaction before scheduling on the downlink. We also propose a QoS monitor which observes the specified QoS and automatically adjusts it according to the requirement. The proposed scheme has been simulated in a hybrid wireless superstore environment and tested for various superstore transactions. The results show that policy-based transaction QoS management enhances performance and utilizes network resources efficiently at the peak times of the superstore business.
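An illustrative sketch (not the authors' architecture) of the basic idea of policy-based, transaction-aware downlink handling: look up per-transaction QoS targets from a policy table and serve the most delay-sensitive transactions first. The transaction classes and the figures below are assumptions.

```python
# Toy policy-based, transaction-aware downlink ordering: transactions with
# tighter delay budgets are served first. Classes and numbers are assumptions.
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Transaction:
    delay_budget_ms: int
    name: str = field(compare=False)

# Policy table: per-transaction-class QoS targets.
POLICY = {
    "payment":      {"max_delay_ms": 50,   "min_kbps": 64},
    "price_lookup": {"max_delay_ms": 200,  "min_kbps": 32},
    "inventory":    {"max_delay_ms": 1000, "min_kbps": 128},
}

def schedule_downlink(pending):
    """Order pending transactions so tighter delay budgets go out first."""
    queue = [Transaction(POLICY[kind]["max_delay_ms"], f"{kind}#{i}")
             for i, kind in enumerate(pending)]
    heapq.heapify(queue)
    while queue:
        yield heapq.heappop(queue).name

print(list(schedule_downlink(["inventory", "payment", "price_lookup", "payment"])))
```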

Relevance: 80.00%

Abstract:

We propose a generic three-pass key agreement protocol that is based on a certain kind of trapdoor one-way function family. When specialized to the RSA setting, the generic protocol yields the so-called KAS2 scheme that has recently been standardized by NIST. On the other hand, when specialized to the discrete log setting, we obtain a new protocol which we call DH2. An interesting feature of DH2 is that parties can use different groups (e.g., different elliptic curves). The generic protocol also has a hybrid implementation, where one party has an RSA key pair and the other party has a discrete log key pair. The security of KAS2 and DH2 is analyzed in an appropriate modification of the extended Canetti-Krawczyk security model.
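A hedged sketch of the general KAS2-style message flow in the RSA setting: each party holds a static RSA key pair, contributes a random secret encrypted under the peer's public key, and both sides derive the session key from the two contributions. This is an illustration only, not the NIST-standardized KAS2 or the paper's generic construction (the third pass, key confirmation and the precise KDF inputs are glossed over); it assumes the `cryptography` package.

```python
# Illustration of a KAS2-style exchange in the RSA setting; not the
# standardized scheme or the paper's generic protocol.
import os, hashlib
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Static key pairs for the two parties.
alice_priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)
bob_priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Pass 1: Alice sends her secret contribution encrypted under Bob's public key.
n_a = os.urandom(32)
c_a = bob_priv.public_key().encrypt(n_a, oaep)

# Pass 2: Bob replies with his contribution encrypted under Alice's public key.
n_b = os.urandom(32)
c_b = alice_priv.public_key().encrypt(n_b, oaep)

# Each side decrypts the peer's contribution and derives the session key.
k_alice = hashlib.sha256(n_a + alice_priv.decrypt(c_b, oaep) + b"A|B").digest()
k_bob = hashlib.sha256(bob_priv.decrypt(c_a, oaep) + n_b + b"A|B").digest()
assert k_alice == k_bob
```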

Relevance: 80.00%

Abstract:

With the development of large-scale wireless networks, there have been shortcomings and limitations in traditional network topology management systems. In this paper, an adaptive algorithm is proposed to maintain the topology of a hybrid wireless superstore network by considering the transactions and the individual network load. The adaptations include choosing the best network connection for the response and performing network connection switching when the network situation changes. At the same time, in terms of the design of topology management systems, aiming at intelligence and real-time operation, the study makes a step-by-step argument and research on the overall topology management scheme. An architecture for the adaptive topology management of hybrid wireless networking resources is made available to the user’s mobile device. Simulation results show that the new scheme outperforms the original topology management and is simpler than the original rate-borrowing scheme.
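A toy sketch (not the paper's algorithm) of the adaptation it describes: score the available connections from their current load and latency, and switch a transaction's downlink only when another connection is clearly better than the one in use. The connection names, weights and switching margin are assumptions.

```python
# Toy adaptive connection selection: score links by load and latency and
# switch only when another link is clearly better. Weights are assumptions.

def score(conn):
    # Lower is better: penalise both utilisation and latency.
    return 0.7 * conn["load"] + 0.3 * (conn["latency_ms"] / 100.0)

def pick_connection(connections, current=None, switch_margin=0.1):
    """Switch only when another connection beats the current one by a margin."""
    best = min(connections, key=score)
    if current is None:
        return best["name"]
    current_conn = next(c for c in connections if c["name"] == current)
    return best["name"] if score(best) + switch_margin < score(current_conn) else current

links = [
    {"name": "wlan", "load": 0.85, "latency_ms": 40},
    {"name": "cell", "load": 0.30, "latency_ms": 90},
]
print(pick_connection(links, current="wlan"))  # switches to "cell" under WLAN load
```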