251 results for Security assurance
Abstract:
All organisations, irrespective of size and type, need effective information security management (ISM) practices to protect vital organisational information assets. However, little is known about the information security management practices of nonprofit organisations. Australian nonprofit organisations (NPOs) employed 889,900 people, managed 4.6 million volunteers and contributed $40,959 million to the economy during 2006-2007 (Australian Bureau of Statistics, 2009). This thesis describes the perceptions of information security management in two Australian NPOs and examines the appropriateness of the ISO 27002 information security management standard in an NPO context. The overall approach to the research is interpretive. A collective case study was performed, consisting of two instrumental case studies in which the researcher was embedded within two NPOs for extended periods of time. Data gathering and analysis were informed by grounded theory and action research, and the Technology Acceptance Model was used as a lens to explore the findings and provide limited generalisability to other contexts. The major findings include a distinct lack of information security management best practice in both organisations. ISM governance and risk management were lacking, and ISM policy was either outdated or non-existent. While some user-focused ISM practices were evident, references to standards such as ISO 27002 were absent. The main factor that negatively impacted ISM practices was the lack of resources available for ISM in the NPOs studied. Two novel aspects of information security discovered in this research were the importance of accuracy and consistency of information. The contribution of this research is a preliminary understanding of ISM practices and perceptions in NPOs. Recommendations for a new approach to managing information security in nonprofit organisations are proposed.
Abstract:
Assurance of learning is a predominant feature of both quality enhancement and quality assurance in higher education. It is a process that articulates explicit program outcomes and standards, and systematically gathers evidence to determine the extent to which performance matches expectations. Benefits accrue to the institution through the systematic assessment of whole-of-program goals: data may be used for continuous improvement, for program development, and to inform external accreditation and evaluation bodies. Recent developments, including the introduction of the Tertiary Education Quality and Standards Agency (TEQSA), will require universities to review the methods they use to assure learning outcomes. This project investigates two critical elements of assurance of learning: 1. the mapping of graduate attributes throughout a program; and 2. the collection of assurance of learning data. An audit was conducted with 25 of the 39 Business Schools in Australian universities to identify current methods of mapping graduate attributes and of collecting assurance of learning data across degree programs, and to review the key challenges faced in these areas. Our findings indicate that external drivers such as professional body accreditation (for example, by the Association to Advance Collegiate Schools of Business (AACSB)) and TEQSA are important motivators for assuring learning, and that schools undertaking AACSB accreditation had more robust assurance of learning systems in place. It was reassuring to see that the majority of institutions (96%) had adopted an embedding approach to assuring learning rather than opting for independent standardised testing. The main challenges evident were developing sustainable processes that are not considered a burden by academic staff, and obtaining academic buy-in to the benefits of assuring learning per se, rather than assurance of learning being seen as a tick-box exercise.
This cultural change is the real challenge in assurance of learning practice.
Abstract:
Defence organisations perform information security evaluations to confirm that electronic communications devices are safe to use in security-critical situations. Such evaluations include tracing all possible dataflow paths through the device, but this process is tedious and error-prone, so automated reachability analysis tools are needed to make security evaluations faster and more accurate. Previous research has produced a tool, SIFA, for dataflow analysis of basic digital circuitry, but it cannot analyse dataflow through microprocessors embedded within the circuit, since this depends on the software they run. We have developed a static analysis tool that produces SIFA-compatible dataflow graphs from embedded microcontroller programs written in C. In this paper we present a case study which shows how this new capability supports combined hardware and software dataflow analyses of a security-critical communications device.
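The idea of deriving a dataflow graph from source code can be illustrated with a deliberately simplified sketch. This is not SIFA's actual algorithm or graph format: the regex below handles only plain C assignment statements, and real analyses must also track pointers, control flow, and function calls.

```python
import re

def dataflow_edges(c_source):
    """Extract (source -> destination) dataflow edges from simple
    C assignment statements.  A toy illustration only."""
    edges = set()
    for line in c_source.splitlines():
        m = re.match(r'\s*(\w+)\s*=\s*([^;]+);', line)
        if m:
            dest, rhs = m.group(1), m.group(2)
            # every identifier on the right-hand side flows into dest
            for src in re.findall(r'[A-Za-z_]\w*', rhs):
                edges.add((src, dest))
    return edges

program = """
    key = input_port;
    mixed = key + plaintext;
    output_port = mixed;
"""
print(sorted(dataflow_edges(program)))
```

Even this toy graph exposes the kind of path a security evaluation looks for: `input_port -> key -> mixed -> output_port`, a transitive route from an input pin to an output pin.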
Abstract:
Attachment difficulties have been proposed as a key risk factor for the development of alexithymia, a multifaceted personality trait characterised by difficulties identifying and describing feelings, a lack of imagination and an externally oriented thinking style. The present study investigated the relationship between attachment and alexithymia in an alcohol-dependent population. Participants were 210 outpatients in a Cognitive Behavioural Treatment Program assessed on the Toronto Alexithymia Scale (TAS-20) and the Revised Adult Attachment Scale (RAAS). Significant relationships between anxious attachment and alexithymia factors were confirmed. Furthermore, alexithymic alcoholics reported significantly higher levels of anxious attachment and significantly lower levels of closeness (secure attachment) compared to non-alexithymic alcoholics. These findings highlight the importance of assessing and targeting anxious attachment among alexithymic alcoholics in order to improve alcohol treatment outcomes.
Keywords: attachment, alexithymia, alcohol dependence.
Abstract:
Various time-memory tradeoff attacks on stream ciphers have been proposed over the years. However, the claimed success of these attacks assumes that the initialisation process of the stream cipher is one-to-one, and some stream cipher proposals do not have a one-to-one initialisation process. In this paper, we examine the impact of this on the success of time-memory-data tradeoff attacks. Under these circumstances, some attacks are more successful than previously claimed while others are less so. The conditions for both cases are established.
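A toy model (not any real cipher, and not the paper's analysis) illustrates why a many-to-one initialisation matters: when distinct (key, IV) pairs collapse onto the same internal state, the number of reachable states falls below the nominal state-space size, which changes the coverage assumptions underlying time-memory-data tradeoff tables.

```python
import hashlib
from itertools import product

def init_state(key, iv, bits):
    """Toy initialisation: hash (key, IV) and truncate to `bits` bits.
    The truncation makes the map many-to-one, standing in for a cipher
    whose initialisation is not injective."""
    h = hashlib.sha256(bytes([key, iv])).digest()
    return int.from_bytes(h, 'big') % (1 << bits)

keys, ivs = range(32), range(32)
states = {init_state(k, v, 8) for k, v in product(keys, ivs)}
total = len(keys) * len(ivs)
print(f"{total} (key, IV) pairs collapse to {len(states)} distinct 8-bit states")
```

Here 1024 loading pairs can reach at most 256 states, so an attacker searching states rather than (key, IV) pairs faces a smaller space, while a defender relying on nominal state size overestimates resistance.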
Abstract:
With the rise in attacks and attempted attacks on marine-based critical infrastructure, maritime security is an issue of increasing importance worldwide. However, there are three significant shortfalls in the efforts to overcome potential threats to maritime security: the need for greater understanding of whether current standards of best practice are truly successful in combating and reducing the risks of terrorism and other security issues, the absence of a collective maritime security best practice framework, and the need for improved access to maritime security-specific graduate and postgraduate (long) courses. This paper presents an overview of existing international, regional and national standards of best practice and shows that literature concerning the measurement and/or success of these standards is virtually non-existent. In addition, despite the importance of maritime workers to ensuring the safety of marine-based critical infrastructure, a similar review of available Australian education courses shows a considerable lack of maritime security-specific courses other than short courses that cover only basic security matters. We argue that the absence of an Australian best practice framework informed by evaluation of current policy responses, particularly in the post-9/11 environment, leaves Australia vulnerable to maritime security threats. As this paper shows, the reality is that despite the security measures put in place post-9/11, there is still considerable work to be done to ensure Australia is equipped to overcome the threats posed to maritime security.
Abstract:
Barreto-Lynn-Scott (BLS) curves are a stand-out candidate for implementing high-security pairings. This paper shows that particular choices of the pairing-friendly search parameter give rise to four subfamilies of BLS curves, all of which offer highly efficient and implementation-friendly pairing instantiations. Curves from these particular subfamilies are defined over prime fields that support very efficient towering options for the full extension field. The coefficients for a specific curve and its correct twist are automatically determined without any computational effort. The choice of an extremely sparse search parameter is immediately reflected by a highly efficient optimal ate Miller loop and final exponentiation. As a resource for implementers, we give a list of examples of implementation-friendly BLS curves across several high-security levels.
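As a concrete illustration (not drawn from this paper's own examples), the widely deployed BLS12-381 curve is generated from exactly this kind of extremely sparse search parameter. The BLS family polynomials for embedding degree 12, together with the standard pairing-friendliness checks, can be verified directly:

```python
# BLS12 family polynomials (embedding degree k = 12):
#   r(x) = x^4 - x^2 + 1              (pairing group order)
#   p(x) = (x - 1)^2 * r(x) / 3 + x   (field characteristic)
#   t(x) = x + 1                      (trace of Frobenius)
x = -0xd201000000010000   # the sparse parameter used by BLS12-381

r = x**4 - x**2 + 1
p = ((x - 1)**2 * r) // 3 + x
t = x + 1

assert ((x - 1)**2 * r) % 3 == 0     # p(x) is integral for this x
assert (p + 1 - t) % r == 0          # r divides the curve order p + 1 - t
assert (p**4 - p**2 + 1) % r == 0    # r divides Phi_12(p): embedding degree 12
print(p.bit_length(), r.bit_length())
```

The parameter's low Hamming weight is what makes the Miller loop (whose length is governed by x) and the final exponentiation cheap; the printed sizes, 381 and 255 bits, are the familiar BLS12-381 field and group sizes.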
Abstract:
This study explores academic perceptions of organizational capability and culture following a project to develop a quality assurance of learning program in a business school. In the project, a community of practice structure was established to include academics in the development of an embedded, direct assurance of learning program affecting more than 5000 undergraduate students and 250 academics from nine different disciplines across four discipline-based departments. The primary outcome of the newly developed and implemented assurance of learning program was the five-year accreditation of the business school’s programs by two international accrediting bodies, EQUIS and AACSB. This study explores a different outcome, namely perceptions of organizational culture and individual capabilities as academics worked together in teaching teams and communities. It uses a survey of, and interviews with, the academics involved, through a retrospective panel design consisting of an experimental group and a control group. Results offer insights into communities of practice as a means of encouraging new individual and organizational capability and strategic culture adaptation.
Abstract:
Power system dynamic analysis and security assessment are becoming more significant today due to increases in size and complexity arising from restructuring, emerging new uncertainties, and the integration of renewable energy sources, distributed generation, and microgrids. Precise modelling of all contributing elements/devices, understanding their interactions in detail, and observing hidden dynamics using existing analysis tools/theorems are difficult, and even impossible. In this chapter, the power system is treated as a continuum, and the propagated electromechanical waves initiated by faults and other random events are studied to provide a new scheme for stability investigation of a large-dimensional system. For this purpose, the electrical indices (such as rotor angle and bus voltage) measured at different points across the network following a fault are used, and the behaviour of the propagated waves through the lines, nodes, and buses is analysed. The impact of weak transmission links on a progressive electromechanical wave is addressed using the energy function concept. It is also emphasised that determining the severity of a disturbance/contingency accurately, without considering the related electromechanical waves, hidden dynamics, and their properties, is not sufficiently reliable. Considering these phenomena requires heavy, time-consuming calculation, which is not suitable for online stability assessment problems. However, using a continuum model for a power system reduces the burden of these complex calculations.
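The continuum idea can be sketched numerically with a uniform one-dimensional wave-equation model. All parameters below (propagation speed, grid, the single-point "fault") are illustrative inventions, not the chapter's actual power system model; the point is only that a local disturbance propagates outwards as a travelling wavefront.

```python
# Leapfrog finite differences for d^2(delta)/dt^2 = v^2 d^2(delta)/dx^2,
# a toy continuum of rotor-angle deviations along a uniform system.
n, steps = 200, 120
v, dx, dt = 1.0, 1.0, 0.5          # CFL number v*dt/dx = 0.5 (stable)
delta      = [0.0] * n             # deviation at time t
delta_prev = [0.0] * n             # deviation at time t - dt
delta[n // 2] = delta_prev[n // 2] = 1.0   # localised "fault" at midpoint

c2 = (v * dt / dx) ** 2
for _ in range(steps):
    nxt = [0.0] * n                # ends stay 0: fixed boundaries
    for i in range(1, n - 1):
        nxt[i] = (2 * delta[i] - delta_prev[i]
                  + c2 * (delta[i + 1] - 2 * delta[i] + delta[i - 1]))
    delta_prev, delta = delta, nxt

peak = max(range(n // 2, n), key=lambda i: abs(delta[i])) - n // 2
print("disturbance peak is about", peak, "grid points from the fault")
```

With speed v and 120 steps of size 0.5, the physical wavefront should sit roughly 60 grid points from the fault location, which is what a measurement-based scheme would observe arriving at successive buses.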
Abstract:
In most digital image watermarking schemes, it has become common practice to address security in terms of robustness, which is basically a norm in cryptography. Such a consideration in the development and evaluation of a watermarking scheme may severely affect its performance and render the scheme ultimately unusable. This paper provides an explicit theoretical analysis of watermarking security and robustness, clarifying the exact status of the problem in the literature. With the necessary hypotheses and analyses from a technical perspective, we demonstrate the fundamental nature of the problem. Finally, some necessary recommendations are made for the complete assessment of watermarking security and robustness.
Abstract:
Sustainable property practices will be essential for Australia’s future. The various levels of government offer incentives aimed at encouraging residents to participate in sustainable practices. Many of these programmes, however, are only accessible to owner-occupiers, or to landlords and tenants with long-term tenancies. Improving security of tenure for tenants, to enable longer-term tenancies, would have a positive impact on property practices. This article explains what security of tenure is and identifies how a lack of it adversely affects property practices. Through a comparison with Genevan property practices, it concludes by making suggestions as to how security of tenure can be reinforced.
Abstract:
Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or on issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program’s object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code.
The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program’s security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool’s capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
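A hypothetical sketch, in the spirit of the design-level, data-encapsulation-based metrics described above, shows how such a metric yields a single comparable number per design. The metric name, class model, and definition here are invented for illustration and are not the thesis's actual metrics:

```python
from dataclasses import dataclass, field

@dataclass
class Attribute:
    name: str
    classified: bool   # carries high-security data?
    public: bool       # accessible outside its class?

@dataclass
class ClassDesign:
    name: str
    attributes: list = field(default_factory=list)

def classified_attribute_exposure(classes):
    """Toy design-level metric: fraction of classified attributes that
    are publicly accessible (0.0 = best encapsulation, 1.0 = worst)."""
    classified = [a for c in classes for a in c.attributes if a.classified]
    if not classified:
        return 0.0
    return sum(a.public for a in classified) / len(classified)

design = [
    ClassDesign("Account", [Attribute("pin", True, True),
                            Attribute("balance", True, False)]),
    ClassDesign("Logger",  [Attribute("buffer", False, True)]),
]
print(classified_attribute_exposure(design))  # 0.5: one of two classified
                                              # attributes is public
```

Because the score is computed from design artifacts alone, two functionally equivalent designs (say, before and after a refactoring that makes `pin` private) can be ranked without executing any code, which is the comparison capability the abstract describes.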
Abstract:
Effective enterprise information security policy management requires review and assessment activities to ensure that information security policies are aligned with business goals and objectives. Since security policy management involves both the policy development process and the security policy as its output, the context for security policy assessment requires goal-based metrics for these two elements. However, current security management assessment methods only provide checklist-style assessments that are predefined by industry best practices and do not allow for developing specific goal-based metrics. Utilizing theories drawn from the literature, this paper proposes the Enterprise Information Security Policy Assessment approach, which expands on the Goal-Question-Metric (GQM) approach. The proposed assessment approach is then applied to an example case scenario to illustrate its practical application. It is shown that the proposed framework addresses the requirement for developing assessment metrics and allows process-based and product-based assessment to be undertaken concurrently. Recommendations for further research include conducting empirical research to validate the propositions and applying the proposed assessment approach in case studies to provide opportunities for further enhancements to the approach.
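The GQM structure that such an approach builds on can be sketched as a small tree: a measurement goal is refined into questions, and each question is answered by one or more metrics. The goal, question, and metric contents below are invented examples, not the paper's actual assessment:

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    value: float = 0.0

@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)

@dataclass
class Goal:
    purpose: str
    questions: list = field(default_factory=list)

# Hypothetical goal for assessing a password policy and its development process
goal = Goal(
    purpose="Assess alignment of the password policy with business objectives",
    questions=[
        Question("Is the policy reviewed on schedule?",          # process-based
                 [Metric("months_since_last_review", 14.0)]),
        Question("Do staff comply with the policy?",             # product-based
                 [Metric("compliance_rate", 0.82)]),
    ],
)

for q in goal.questions:
    for m in q.metrics:
        print(f"{q.text} -> {m.name} = {m.value}")
```

Deriving metrics top-down from stated goals, rather than from a fixed industry checklist, is what lets process-based and product-based assessment run concurrently under the same goal.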