971 results for security protocols
Abstract:
Attachment difficulties have been proposed as a key risk factor for the development of alexithymia, a multifaceted personality trait characterised by difficulties identifying and describing feelings, a lack of imagination and an externally oriented thinking style. The present study investigated the relationship between attachment and alexithymia in an alcohol dependent population. Participants were 210 outpatients in a Cognitive Behavioural Treatment Program assessed on the Toronto Alexithymia Scale (TAS-20) and the Revised Adult Attachment Scale (RAAS). Significant relationships between anxious attachment and alexithymia factors were confirmed. Furthermore, alexithymic alcoholics reported significantly higher levels of anxious attachment and significantly lower levels of closeness (secure attachment) compared to non-alexithymic alcoholics. These findings highlight the importance of assessing and targeting anxious attachment among alexithymic alcoholics in order to improve alcohol treatment outcomes. Keywords: Attachment, alexithymia, alcohol dependence.
Abstract:
With the rise in attacks and attempted attacks on marine-based critical infrastructure, maritime security is an issue of increasing importance worldwide. However, there are three significant shortfalls in the efforts to overcome potential threats to maritime security: the need for greater understanding of whether current standards of best practice are truly successful in combating and reducing the risks of terrorism and other security issues, the absence of a collective maritime security best practice framework, and the need for improved access to maritime security-specific graduate and postgraduate (long) courses. This paper presents an overview of existing international, regional and national standards of best practice and shows that literature concerning the measurement and/or success of standards is virtually non-existent. In addition, despite the importance of maritime workers to ensuring the safety of marine-based critical infrastructure, a similar review of available Australian education courses shows a considerable lack of availability of maritime security-specific courses other than short courses that cover only basic security matters. We argue that the absence of an Australian best practice framework informed by evaluation of current policy responses, particularly in the post-9/11 environment, leaves Australia vulnerable to maritime security threats. As this paper shows, the reality is that despite the security measures put in place post-9/11, there is still considerable work to be done to ensure Australia is equipped to overcome the threats posed to maritime security.
Abstract:
Client puzzles are moderately hard cryptographic problems, neither easy nor impossible to solve, that can be used as a counter-measure against denial-of-service attacks on network protocols. Puzzles based on modular exponentiation are attractive as they provide important properties such as non-parallelisability, deterministic solving time, and linear granularity. We propose an efficient client puzzle based on modular exponentiation. Our puzzle requires only a few modular multiplications for puzzle generation and verification. For a server under denial-of-service attack, this is a significant improvement, as the best known non-parallelisable puzzle, proposed by Karame and Capkun (ESORICS 2010), requires at least a 2k-bit modular exponentiation, where k is a security parameter. We show that our puzzle satisfies the unforgeability and difficulty properties defined by Chen et al. (Asiacrypt 2009). We present experimental results which show that, for 1024-bit moduli, our proposed puzzle can be up to 30 times faster to verify than the Karame-Capkun puzzle and 99 times faster than Rivest et al.'s time-lock puzzle.
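The time-lock puzzle of Rivest et al. that the abstract uses as a baseline can be sketched as follows. This is a toy illustration of that baseline (not the paper's improved construction), with deliberately small parameters; real puzzles use large RSA moduli.

```python
# Sketch of a Rivest-style time-lock client puzzle: the client must
# perform t sequential modular squarings (non-parallelisable, with a
# solving time deterministic in t), while the server, knowing phi(n),
# verifies with a single fast modular exponentiation.
# Toy-sized parameters for illustration only.

def solver_work(base, t, n):
    """Client side: compute base^(2^t) mod n by t sequential squarings."""
    x = base % n
    for _ in range(t):
        x = (x * x) % n
    return x

p, q = 1000003, 1000033            # toy primes; real moduli are 1024+ bits
n, phi = p * q, (p - 1) * (q - 1)
t, base = 10_000, 2

solution = solver_work(base, t, n)         # client's slow, sequential path
expected = pow(base, pow(2, t, phi), n)    # server's shortcut via phi(n)
assert solution == expected
```

The asymmetry is the point: the server reduces the exponent 2^t modulo phi(n) in one step, while a client without the factorisation of n has no faster route than the t squarings.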
Abstract:
Barreto-Lynn-Scott (BLS) curves are a stand-out candidate for implementing high-security pairings. This paper shows that particular choices of the pairing-friendly search parameter give rise to four subfamilies of BLS curves, all of which offer highly efficient and implementation-friendly pairing instantiations. Curves from these particular subfamilies are defined over prime fields that support very efficient towering options for the full extension field. The coefficients for a specific curve and its correct twist are automatically determined without any computational effort. The choice of an extremely sparse search parameter is immediately reflected by a highly efficient optimal ate Miller loop and final exponentiation. As a resource for implementors, we give a list with examples of implementation-friendly BLS curves through several high-security levels.
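The role of the sparse search parameter can be made concrete with the standard BLS12 family parametrisation from the pairing literature. The seed below is the later-standardised BLS12-381 value, used here only as a worked example of a low-Hamming-weight parameter; the paper's own subfamilies and example curves may use different seeds.

```python
# BLS12 family parametrisation: the field characteristic p and group
# order r are both polynomials evaluated at a single search parameter x.
# A sparse x (few set bits) directly shortens the Miller loop and the
# final exponentiation, which is the efficiency point made above.

def bls12_params(x):
    """Return (p, r) for the BLS12 family from search parameter x."""
    r = x**4 - x**2 + 1                    # prime-order subgroup size
    p = ((x - 1) ** 2 * r) // 3 + x        # prime field characteristic
    return p, r

x = -0xD201000000010000                    # sparse seed (BLS12-381)
p, r = bls12_params(x)

assert p.bit_length() == 381               # the "381" in BLS12-381
assert r.bit_length() == 255
assert (p**12 - 1) % r == 0                # embedding degree 12: r | p^12 - 1
```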
Abstract:
Power system dynamic analysis and security assessment are becoming more significant today due to increases in size and complexity from restructuring, emerging new uncertainties, integration of renewable energy sources, distributed generation, and microgrids. Precise modelling of all contributing elements/devices, understanding their interactions in detail, and observing hidden dynamics using existing analysis tools/theorems are difficult, and even impossible. In this chapter, the power system is considered as a continuum, and the propagated electromechanical waves initiated by faults and other random events are studied to provide a new scheme for stability investigation of a large-dimensional system. For this purpose, the measured electrical indices (such as rotor angle and bus voltage) following a fault at different points in the network are used, and the behaviour of the propagated waves through the lines, nodes, and buses is analyzed. The impact of weak transmission links on a progressive electromechanical wave is addressed using the energy function concept. It is also emphasized that determining the severity of a disturbance/contingency accurately, without considering the related electromechanical waves, hidden dynamics, and their properties, is not secure enough. Considering these phenomena requires heavy and time-consuming calculation, which is not suitable for online stability assessment problems. However, using a continuum model for a power system reduces the burden of complex calculations.
Abstract:
In most digital image watermarking schemes, it has become common practice to address security in terms of robustness, which is basically a norm in cryptography. Such a consideration in the development and evaluation of a watermarking scheme may severely affect the performance and render the scheme ultimately unusable. This paper provides an explicit theoretical analysis of watermarking security and robustness, establishing the exact status of the problem in the literature. With the necessary hypotheses and analyses from a technical perspective, we demonstrate the fundamental realization of the problem. Finally, some necessary recommendations are made for the complete assessment of watermarking security and robustness.
Abstract:
Purpose of review: This review provides an overview on the importance of characterising and considering insect distribution information for designing stored commodity sampling protocols. Findings: Sampling protocols are influenced by a number of factors including government regulations, management practices, new technology and current perceptions of the status of insect pest damage. The spatial distribution of insects in stored commodities influences the efficiency of sampling protocols; these can vary in response to season, treatment and other factors. It is important to use sampling designs based on robust statistics suitable for the purpose. Future research: The development of sampling protocols based on flexible, robust statistics allows for accuracy across a range of spatial distributions. Additionally, power can be added to sampling protocols through the integration of external information such as treatment history and climate. Bayesian analysis provides a coherent and well understood means to achieve this.
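The kind of Bayesian integration of external information the review points to can be sketched with a minimal beta-binomial update. The prior counts and inspection figures below are invented for illustration; they stand in for whatever treatment history or climate data a real protocol would encode.

```python
# Toy beta-binomial update: fold prior information (e.g. treatment
# history suggesting low infestation) into the estimate produced by a
# new round of commodity sampling. All numbers are illustrative.

prior_a, prior_b = 1, 9          # prior belief: roughly 10% of samples infested
infested, clean = 3, 47          # new inspection: 3 infested out of 50 samples

# Conjugate update: posterior Beta(a + infested, b + clean).
post_a, post_b = prior_a + infested, prior_b + clean
posterior_mean = post_a / (post_a + post_b)

assert abs(posterior_mean - 4 / 60) < 1e-12   # (1 + 3) / (1 + 9 + 50)
```

A sampling protocol could then size its next inspection from the posterior rather than from the raw sample proportion alone, which is how prior history "adds power" to the design.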
Abstract:
Sustainable property practices will be essential for Australia’s future. The various levels of government offer incentives aimed at encouraging residents to participate in sustainable practices. Many of these programmes however are only accessible by owner occupiers, or landlords and tenants with long term tenancies. Improving security of tenure for tenants, to enable longer term tenancies, would positively impact upon property practices. This article explains what security of tenure is and identifies how a lack of security of tenure adversely impacts property practices. By comparison with Genevan property practices, it concludes by making suggestions as to how security of tenure can be reinforced.
Abstract:
Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program’s object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well-covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code.
The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program’s security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool’s capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
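The flavour of a design-level security metric of this kind can be illustrated with a simplified, invented example: the fraction of security-critical ("classified") attributes that a class design exposes publicly. The class model and metric name below are hypothetical sketches in the spirit of the abstract, not the paper's actual metric definitions.

```python
# Hypothetical design-level security metric: classified attribute
# exposure, i.e. the fraction of high-security attributes that are
# publicly accessible. Comparing the value before and after a change
# mirrors the "compare two revisions" use described above.

from dataclasses import dataclass, field

@dataclass
class ClassDesign:
    name: str
    classified_attrs: set = field(default_factory=set)  # high-security data
    public_attrs: set = field(default_factory=set)      # publicly accessible

def classified_attribute_exposure(design: ClassDesign) -> float:
    """Lower is better: 0.0 means no classified attribute is public."""
    if not design.classified_attrs:
        return 0.0
    exposed = design.classified_attrs & design.public_attrs
    return len(exposed) / len(design.classified_attrs)

before = ClassDesign("Account", {"pin", "balance"}, {"pin", "owner"})
after  = ClassDesign("Account", {"pin", "balance"}, {"owner"})  # encapsulated

assert classified_attribute_exposure(before) == 0.5
assert classified_attribute_exposure(after) == 0.0
```

Because such a metric is computable from design artifacts alone (attribute lists and visibility), it can flag defects before any code exists, which is the early-detection point the abstract makes.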
Abstract:
Effective enterprise information security policy management requires review and assessment activities to ensure information security policies are aligned with business goals and objectives. As security policy management involves the elements of policy development process and the security policy as output, the context for security policy assessment requires goal-based metrics for these two elements. However, the current security management assessment methods only provide checklist types of assessment that are predefined by industry best practices and do not allow for developing specific goal-based metrics. Utilizing theories drawn from literature, this paper proposes the Enterprise Information Security Policy Assessment approach that expands on the Goal-Question-Metric (GQM) approach. The proposed assessment approach is then applied in a case scenario example to illustrate a practical application. It is shown that the proposed framework addresses the requirement for developing assessment metrics and allows for the concurrent undertaking of process-based and product-based assessment. Recommendations for further research activities include the conduct of empirical research to validate the propositions and the practical application of the proposed assessment approach in case studies to provide opportunities to introduce further enhancements to the approach.
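The Goal-Question-Metric structure that the proposed assessment approach builds on can be shown as a small data sketch. The goal, question, and metric texts below are invented examples of the GQM pattern, not content from the proposed framework itself.

```python
# Hypothetical GQM tree for security policy assessment: one goal is
# refined into questions, and each question is made measurable by one
# or more metrics. The texts are illustrative placeholders.

goal = {
    "goal": "Assess alignment of the security policy with business objectives",
    "questions": [
        {
            "question": "Are policy statements traceable to business goals?",
            "metrics": ["fraction of policy statements with a mapped business goal"],
        },
        {
            "question": "Is the policy reviewed within its defined cycle?",
            "metrics": ["days since last review divided by the review interval"],
        },
    ],
}

# The defining GQM constraint: every question carries at least one metric,
# so the assessment is goal-based rather than a fixed checklist.
assert all(q["metrics"] for q in goal["questions"])
```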
Abstract:
Information security has been recognized as a core requirement for corporate governance that is expected to facilitate not only the management of risks, but also as a corporate enabler that supports and contributes to the sustainability of organizational operations. In implementing information security, the enterprise information security policy is the set of principles and strategies that guide the course of action for the security activities and may be represented as a brief statement that defines program goals and sets information security and risk requirements. The enterprise information security policy (alternatively referred to as security policy in this paper) that represents the meta-policy of information security is an element of corporate ICT governance and is derived from the strategic requirements for risk management and corporate governance. Consistent alignment between the security policy and the other corporate business policies and strategies has to be maintained if information security is to be implemented according to evolving business objectives. This alignment may be facilitated by managing security policy alongside other corporate business policies within the strategic management cycle. There are however limitations in current approaches for developing and managing the security policy to facilitate consistent strategic alignment. This paper proposes a conceptual framework for security policy management by presenting propositions to positively affect security policy alignment with business policies and prescribing a security policy management approach that expounds on the propositions.
Abstract:
Key establishment is a crucial primitive for building secure channels in a multi-party setting. Without quantum mechanics, key establishment can only be done under the assumption that some computational problem is hard. Since digital communication can be easily eavesdropped and recorded, it is important to consider the secrecy of information anticipating future algorithmic and computational discoveries which could break the secrecy of past keys, violating the secrecy of the confidential channel. Quantum key distribution (QKD) can be used to generate secret keys that are secure against any future algorithmic or computational improvements. QKD protocols still require authentication of classical communication, although existing security proofs of QKD typically assume idealized authentication. It is generally considered folklore that QKD, when used with computationally secure authentication, is still secure against an unbounded adversary, provided the adversary did not break the authentication during the run of the protocol. We describe a security model for quantum key distribution extending classical authenticated key exchange (AKE) security models. Using our model, we characterize the long-term security of the BB84 QKD protocol with computationally secure authentication against an eventually unbounded adversary. By basing our model on traditional AKE models, we can more readily compare the relative merits of various forms of QKD and existing classical AKE protocols. This comparison illustrates in which types of adversarial environments different quantum and classical key agreement protocols can be secure.
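For readers unfamiliar with BB84, its quantum phase can be illustrated with a toy classical simulation of basis choice and sifting. This models only the noiseless, eavesdropper-free happy path; the authentication of the classical channel, which is the abstract's actual focus, is not modelled here.

```python
# Toy BB84 simulation (no noise, no eavesdropper). Alice encodes random
# bits in randomly chosen bases; Bob measures in his own random bases.
# A matching basis reproduces Alice's bit; a mismatched basis yields a
# uniformly random outcome. Sifting keeps only matching-basis positions.

import random

random.seed(7)
n = 64

alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("ZX") for _ in range(n)]
bob_bases   = [random.choice("ZX") for _ in range(n)]

bob_bits = [bit if ab == bb else random.randint(0, 1)
            for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: bases are compared over the (authenticated!) classical
# channel, and only positions where they match are kept.
sifted = [(a, b) for a, b, ab, bb in
          zip(alice_bits, bob_bits, alice_bases, bob_bases) if ab == bb]

assert all(a == b for a, b in sifted)   # noiseless case: sifted keys agree
```

The basis comparison is exactly the classical exchange that must be authenticated: an adversary who can forge those messages during the run can mount a man-in-the-middle attack, which is why the strength of the authentication matters to the model described above.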
Abstract:
We investigate existing cloud storage schemes and identify limitations in each one based on the security services that they provide. We then propose a new cloud storage architecture that extends CloudProof of Popa et al. to provide availability assurance. This is accomplished by incorporating a proof of storage protocol. As a result, we obtain the first secure storage cloud computing scheme that furnishes all three properties of availability, fairness and freshness.
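The availability assurance added via a proof of storage protocol can be sketched, in a deliberately simplified form, as a challenge-response spot check on random blocks. This hash-based sketch only conveys the interaction pattern; real proof-of-storage schemes (including the one incorporated above) use more compact and cryptographically stronger responses than returning whole blocks.

```python
# Simplified proof-of-storage spot check: the client retains per-block
# digests, challenges the server on randomly chosen block indices, and
# verifies the returned blocks against the digests. A server that lost
# a block cannot answer its challenge correctly.

import hashlib, os, random

BLOCK = 256
data = os.urandom(BLOCK * 32)                  # file the client outsources
blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]

digests = [hashlib.sha256(b).digest() for b in blocks]  # kept by the client

def prove(stored_blocks, index):
    """Server: answer a challenge by returning the requested block."""
    return stored_blocks[index]

def verify(index, returned_block):
    """Client: check the response against the retained digest."""
    return hashlib.sha256(returned_block).digest() == digests[index]

random.seed(1)
for index in random.sample(range(len(blocks)), 5):
    assert verify(index, prove(blocks, index))       # honest server passes

corrupted = list(blocks)
corrupted[3] = os.urandom(BLOCK)                     # server lost block 3
assert not verify(3, prove(corrupted, 3))            # spot check detects it
```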