818 results for high security
at Queensland University of Technology - ePrints Archive
Abstract:
Barreto-Lynn-Scott (BLS) curves are a stand-out candidate for implementing high-security pairings. This paper shows that particular choices of the pairing-friendly search parameter give rise to four subfamilies of BLS curves, all of which offer highly efficient and implementation-friendly pairing instantiations. Curves from these particular subfamilies are defined over prime fields that support very efficient towering options for the full extension field. The coefficients for a specific curve and its correct twist are automatically determined without any computational effort. The choice of an extremely sparse search parameter is immediately reflected in a highly efficient optimal ate Miller loop and final exponentiation. As a resource for implementors, we give a list of example implementation-friendly BLS curves at several high-security levels.
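For concreteness, the BLS12 family parameterization that such subfamilies build on can be sketched as below. The sparse search parameter shown is the value later popularized as the BLS12-381 curve; it is used here purely as an illustration and is not claimed to be one of the curves listed in the paper.

```python
# BLS12 family parameterization (embedding degree k = 12): the field
# characteristic p and the group order r are polynomials in a single
# search parameter x.  The x below (from BLS12-381) is an illustrative
# assumption, chosen for its very low Hamming weight -- exactly the
# kind of sparseness that keeps the Miller loop and final
# exponentiation cheap.
x = -0xD201000000010000

r = x**4 - x**2 + 1              # prime group order
p = (x - 1)**2 * r // 3 + x      # prime field characteristic (exact division)

print(p.bit_length())            # 381
print(r.bit_length())            # 255
```

The integer division by 3 is exact for valid BLS12 parameters, which is why a sparse `x` translates directly into sparse loop constants.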
Abstract:
The article discusses issues of resistance, that is, resistance by prisoners to the various manifestations of power operating in high security prisons, as well as attempted shifts in the regime from physical to psychological control. Other topics highlighted include legitimacy and 'official discourse', mourning and the construction of 'ungrievable lives', and the importance of finding a way out of the cycle of violence that high security regimes perpetuate.
Abstract:
“Supermax” prisons, conceived by the United States in the early 1980s, are typically reserved for convicted political criminals such as terrorists and spies and for other inmates who are considered to pose a serious ongoing threat to the wider community, to the security of correctional institutions, or to the safety of other inmates. Prisoners are usually restricted to their cells for up to twenty-three hours a day and typically have minimal contact with other inmates and correctional staff. Not only does the Federal Bureau of Prisons operate one of these facilities, but almost every state has either a supermax wing or a stand-alone supermax prison. The Globalization of Supermax Prisons examines why nine advanced industrialized countries have adopted the supermax prototype, paying particular attention to the economic, social, and political processes that have affected each state. Featuring essays that look at the U.S.-run prisons of Abu Ghraib and Guantánamo, this collection seeks to determine if the American model is the basis for the establishment of these facilities and considers such issues as the support for, or opposition to, the building of a supermax and why opposition efforts failed; the allegation of human rights abuses within these prisons; and the extent to which the decision to build a supermax was influenced by developments in the United States. Additionally, contributors address such domestic matters as the role of crime rates, media sensationalism, and terrorism in each country’s decision to build a supermax prison.
Abstract:
This research used the Queensland Police Service, Australia, as a major case study. Information is reported on the principles, techniques and processes used, and on the reasons for recording, storing and releasing audit information for evidentiary purposes. It is shown that Law Enforcement Agencies have a two-fold interest in, and legal obligation pertaining to, audit trails. The first interest relates to the situation where audit trails are actually used by criminals in the commission of crime, and the second to where audit trails are generated by the information systems used by the police themselves in support of the recording and investigation of crime. Eleven court cases involving Queensland Police Service audit trails used in evidence in Queensland courts were selected for further analysis. It is shown that, of the cases studied, none of the evidence presented was rejected or seriously challenged from a technical perspective. These results were further analysed and related to normal requirements for trusted maintenance of audit trail information in sensitive environments, with discussion of the ability and/or willingness of courts to fully challenge, assess or value audit evidence presented. Managerial and technical frameworks are proposed for, firstly, what may be considered an environment in which a computer system is operating “properly” and, secondly, what education, training, qualifications, expertise and the like may be considered appropriate for the persons responsible within that environment. Analysis was undertaken to determine whether audit and control of information in a high security environment, such as law enforcement, has improved, or not, in the transition from manual to electronic processes.
Information collection, control of processing and audit in manual processes used by the Queensland Police Service, Australia, in the period 1940 to 1980 were assessed against current electronic systems, essentially introduced to policing in the decades of the 1980s and 1990s. Results show that electronic systems do provide for faster communications, with centrally controlled and updated information readily available for use by large numbers of users who are connected across significant geographical locations. However, it is clearly evident that the price paid for this is a lack of ability and/or reluctance to provide improved audit and control processes. To compare the information systems audit and control arrangements of the Queensland Police Service with other government departments or agencies, an Australia-wide survey was conducted. Results of the survey were contrasted with the results of a survey conducted four years earlier by the Australian Commonwealth Privacy Commission, which showed that security in relation to the recording of activity against access to information held on Australian government computer systems had been poor and a cause for concern. However, within this four-year period there is evidence to suggest that government organisations are increasingly inclined to generate audit trails. An attack on the overall security of audit trails in computer operating systems was initiated to further investigate findings reported in relation to the government systems survey. The survey showed that information systems audit trails in Microsoft Corporation's “Windows” operating system environments are relied on quite heavily. An audit of the security for audit trails generated, stored and managed in the Microsoft “Windows 2000” operating system environment was undertaken and compared and contrasted with similar audit trail schemes in the “UNIX” and “Linux” operating systems.
Strength of passwords and exploitation of any security problems in access control were targeted using software tools that are freely available in the public domain. Results showed that such security for the “Windows 2000” system is seriously flawed and the integrity of audit trails stored within these environments cannot be relied upon. A framework and set of guidelines for use by expert witnesses in the information technology (IT) profession are then proposed. This is achieved by examining the current rules and guidelines related to the provision of expert evidence in a court environment, by analysing the rationale for the separation of distinct disciplines and corresponding bodies of knowledge used by the Medical Profession and Forensic Science, and then by analysing the bodies of knowledge within the discipline of IT itself. It is demonstrated that the accepted processes and procedures relevant to expert witnessing in a court environment are transferable to the IT sector. However, unlike some discipline areas, this analysis has clearly identified two distinct aspects of the matter which appear particularly relevant to IT. These two areas are: expertise gained through the application of IT to information needs in a particular public or private enterprise; and expertise gained through accepted and verifiable education, training and experience in fundamental IT products and systems.
Abstract:
Operators of busy contemporary airports have to balance tensions between the timely flow of passengers, flight operations, the conduct of commercial business activities and the effective application of security processes. In addition to specific onsite issues, airport operators liaise with a range of organisations which set and enforce aviation-related policies and regulations, with border security agencies responsible for customs, quarantine and immigration, and with first-response security services. The challenging demands of coordinating and planning in such complex socio-technical contexts place considerable pressure on airport management to facilitate coordination of what are often conflicting goals and expectations among groups that have standing in respect to safe and secure air travel. Options for the optimal coordination of efforts from the range of public and private sector participants active in airport security and crisis management remain, as yet, significantly unexplored in large airports. A further aspect of this issue is how airport management systems operate when there is a transition from business-as-usual into an emergency/crisis situation and then, on recovery, back to ‘normal’ functioning. Business Continuity Planning (BCP), incorporating sub-plans for emergency response, continuation of output and recovery of degraded operating capacity, would fit such a context. The implementation of BCP practices in such a significant high security setting offers considerable potential benefit yet entails substantial challenges. This paper presents early results of a four-year nationally funded industry-based research project examining the merger of Business Continuity Planning and Transport Security Planning as a means of generating capability for improved security and reliability and, ultimately, enhanced resilience in major airports.
The project is part of a larger research program on the Design of Secure Airports that includes most of the gazetted ‘first response’ international airports in Australia, key Aviation industry groups and all aviation-related border and security regulators as collaborative partners. The paper examines a number of initial themes in the research, including:
- Approaches to integrating Business Continuity & Aviation Security Planning within airport operations;
- Assessment of gaps in management protocols and operational capacities for identifying and responding to crises within and across critical aviation infrastructure;
- Identification of convergent and divergent approaches to crisis management used across Australasia and their alignment to planned and possible infrastructure evolution.
Abstract:
A security system based on the recognition of the iris of human eyes using the wavelet transform is presented. The zero-crossings of the wavelet transform are used to extract the unique features obtained from the grey-level profiles of the iris. The recognition process is performed in two stages. The first stage consists of building a one-dimensional representation of the grey-level profiles of the iris, followed by obtaining the wavelet transform zero-crossings of the resulting representation. The second stage is the matching procedure for iris recognition. The proposed approach uses only a few selected intermediate resolution levels for matching, thus making it computationally efficient as well as less sensitive to noise and quantisation errors. A normalisation process is implemented to compensate for size variations due to possible changes in the camera-to-face distance. The technique has been tested on real images in both noise-free and noisy conditions. The technique is being investigated for real-time implementation, as a stand-alone system, for access control to high-security areas.
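The two-stage pipeline (1-D profile, wavelet zero-crossings, then matching on crossing positions) can be sketched as follows. This is a toy illustration: a difference-of-window-means filter stands in for the paper's actual dyadic wavelet, and the profiles are synthetic signals rather than iris images.

```python
import math

# Toy sketch of zero-crossing matching on a 1-D grey-level profile.
# The filter and signals below are illustrative assumptions, not the
# paper's wavelet or real iris data.

def detail(signal, scale):
    """Crude wavelet detail: difference of adjacent window means."""
    return [sum(signal[i + scale:i + 2 * scale]) / scale -
            sum(signal[i:i + scale]) / scale
            for i in range(len(signal) - 2 * scale)]

def zero_crossings(coeffs):
    """Indices where the detail signal changes sign."""
    return {i for i in range(1, len(coeffs))
            if coeffs[i - 1] * coeffs[i] < 0}

def dissimilarity(a, b, scale=4):
    """Size of the symmetric difference of zero-crossing positions."""
    return len(zero_crossings(detail(a, scale)) ^
               zero_crossings(detail(b, scale)))

profile = [math.sin(i / 3.0) for i in range(64)]                  # enrolled profile
noisy = [v + 0.01 * (-1) ** i for i, v in enumerate(profile)]     # same eye, noisy
other = [math.sin(i / 5.0) for i in range(64)]                    # different pattern

print(dissimilarity(profile, noisy))   # low: fine noise cancels at this scale
print(dissimilarity(profile, other))   # higher: different crossing positions
```

Matching at an intermediate scale, as here, is what gives the noise robustness the abstract describes: high-frequency perturbations average out before the zero-crossings are taken.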
Abstract:
This paper introduces our dedicated authenticated encryption scheme ICEPOLE. ICEPOLE is a high-speed hardware-oriented scheme, suitable for high-throughput network nodes or generally any environment where specialized hardware (such as FPGAs or ASICs) can be used to provide high data processing rates. ICEPOLE-128 (the primary ICEPOLE variant) is very fast. On the modern FPGA device Virtex 6, a basic iterative architecture of ICEPOLE reaches 41 Gbit/s, which is over 10 times faster than the equivalent implementation of AES-128-GCM. The throughput-to-area ratio is also substantially better when compared to AES-128-GCM. We have carefully examined the security of the algorithm through a range of cryptanalytic techniques, and our findings indicate that ICEPOLE offers a high security level.
Abstract:
Elliptic curve pairings are undoubtedly the most powerful known primitive in public-key cryptography. Upon their introduction just over ten years ago, the computation of pairings was far too slow for them to be considered a practical option. This resulted in a vast amount of research from many mathematicians and computer scientists around the globe aiming to improve this computation speed. From the use of modern results in algebraic and arithmetic geometry to the application of foundational number theory that dates back to the days of Gauss and Euler, cryptographic pairings have since experienced a great deal of improvement. As a result, what was an extremely expensive computation that took several minutes is now a high-speed operation that takes less than a millisecond. This thesis presents a range of optimisations to the state of the art in cryptographic pairing computation. Both through extending prior techniques, and introducing several novel ideas of our own, our work has contributed to record-breaking pairing implementations.
Abstract:
Introduction: Work engagement is a recent application of positive psychology and refers to a positive, fulfilling, work-related state of mind characterized by vigor, dedication and absorption. Despite theoretical assumptions, there is little published research on work engagement, due primarily to its recent emergence as a psychological construct. Furthermore, examining work engagement among high-stress occupations, such as police, is useful because police officers are generally characterized as having a high level of work engagement. Previous research has identified job resources (e.g. social support) as antecedents of work engagement. However, detailed evaluation of job demands as an antecedent of work engagement within high-stress occupations has been scarce. Thus, our second aim was to test job demands (i.e. monitoring demands and problem-solving demands) and job resources (i.e. time control, method control, supervisory support, colleague support, and friend and family support) as antecedents of work engagement among police officers. Method: Data were collected via a self-report online survey from one Australian state police service (n = 1,419). Due to the high number of hypothesized antecedent variables, hierarchical multiple regression analysis was employed rather than structural equation modelling. Results: Work engagement reported by police officers was high. Female officers had significantly higher levels of work engagement than male officers, while officers at mid-level ranks (sergeant) reported the lowest levels of work engagement. Job resources (method control, supervisor support and colleague support) were significant antecedents of three dimensions of work engagement. Only monitoring demands were a significant antecedent of absorption. Conclusion: Having healthy and engaged police officers is important for community security and economic growth. This study identified some common factors which influence the work engagement experienced by police officers.
However, we also note that excessive work engagement can yield negative outcomes such as psychological distress.
Abstract:
This paper describes the gaps in monitoring and surveillance identified while conducting Community Food Security assessments in three geographical areas located in south-east Queensland, Australia.
Abstract:
Research on efficient pairing implementation has focussed on reducing the loop length and on using high-degree twists. Existence of twists of degree larger than 2 is a very restrictive criterion, but luckily constructions for pairing-friendly elliptic curves with such twists exist. In fact, Freeman, Scott and Teske showed in their overview paper that often the best known methods of constructing pairing-friendly elliptic curves over fields of large prime characteristic produce curves that admit twists of degree 3, 4 or 6. A few papers have presented explicit formulas for the doubling and the addition step in Miller’s algorithm, but the optimizations were all done for the Tate pairing with degree-2 twists, so the main usage of the high-degree twists remained incompatible with more efficient formulas. In this paper we present efficient formulas for curves with twists of degree 2, 3, 4 or 6. These formulas are significantly faster than their predecessors. We show how these faster formulas can be applied to Tate and ate pairing variants, thereby speeding up all practical suggestions for efficient pairing implementations over fields of large characteristic.
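For background, the standard fact that makes high-degree twists attractive can be stated in general terms (this statement is generic, not quoted from the paper): a degree-d twist lets the second pairing input be represented over a proper subfield of the full extension field.

```latex
% If E/\mathbb{F}_p has embedding degree k and admits a twist E' of
% degree d dividing k, the order-r group G_2 used as the second
% pairing input can be represented on the twist over the smaller
% field \mathbb{F}_{p^{k/d}}:
\[
  G_2 \;=\; \psi\bigl( E'(\mathbb{F}_{p^{k/d}})[r] \bigr),
  \qquad
  \psi : E' \xrightarrow{\;\sim\;} E
  \ \text{defined over } \mathbb{F}_{p^k}.
\]
```

Arithmetic on $G_2$ points thus happens over $\mathbb{F}_{p^{k/d}}$ rather than $\mathbb{F}_{p^k}$, which is why twists of degree 3, 4 or 6 yield such large savings in Miller's algorithm.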
Abstract:
High-rate flooding attacks (aka Distributed Denial of Service or DDoS attacks) continue to constitute a pernicious threat within the Internet domain. In this work we demonstrate how using packet source IP addresses coupled with a change-point analysis of the rate of arrival of new IP addresses may be sufficient to detect the onset of a high-rate flooding attack. Importantly, minimizing the number of features to be examined, directly addresses the issue of scalability of the detection process to higher network speeds. Using a proof of concept implementation we have shown how pre-onset IP addresses can be efficiently represented using a bit vector and used to modify a “white list” filter in a firewall as part of the mitigation strategy.
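The detection idea can be sketched as follows: count previously unseen source IPs per time window, then flag a change point when a one-sided CUSUM of that rate exceeds a threshold. In this sketch a plain Python set stands in for the paper's bit-vector representation, and the traffic, drift and threshold values are illustrative assumptions.

```python
# Change-point detection on the arrival rate of new source IPs.
# Synthetic traffic and illustrative CUSUM parameters -- a sketch of
# the approach, not the paper's implementation.

def new_ip_rates(windows):
    """For each time window (a list of source IPs), count the number
    of distinct IPs never seen in any earlier window."""
    seen, rates = set(), []
    for window in windows:
        fresh = set(window) - seen
        rates.append(len(fresh))
        seen |= fresh
    return rates

def cusum_onset(rates, drift, threshold):
    """First window index where a one-sided CUSUM of (rate - drift)
    exceeds threshold, or None if no change point is detected."""
    s = 0.0
    for i, rate in enumerate(rates):
        s = max(0.0, s + rate - drift)
        if s > threshold:
            return i
    return None

# Ten windows of repeat visitors, then five windows of spoofed sources.
normal = [["10.0.0.%d" % (i % 8) for i in range(20)] for _ in range(10)]
flood = [["172.16.%d.%d" % (w, i) for i in range(50)] for w in range(5)]

rates = new_ip_rates(normal + flood)
print(cusum_onset(rates, drift=5.0, threshold=40.0))   # 10: first flood window
```

Because only one scalar feature per window is tracked, the detector's state stays small regardless of traffic volume, matching the abstract's scalability argument.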
Abstract:
Several studies have developed metrics for software quality attributes of object-oriented designs such as reusability and functionality. However, metrics which measure the quality attribute of information security have received little attention. Moreover, existing security metrics measure either the system from a high level (i.e. the whole system’s level) or from a low level (i.e. the program code’s level). These approaches make it hard and expensive to discover and fix vulnerabilities caused by software design errors. In this work, we focus on the design of an object-oriented application and define a number of information security metrics derivable from a program’s design artifacts. These metrics allow software designers to discover and fix security vulnerabilities at an early stage, and help compare the potential security of various alternative designs. In particular, we present security metrics based on composition, coupling, extensibility, inheritance, and the design size of a given object-oriented, multi-class program from the point of view of potential information flow.
Abstract:
Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or on issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program’s object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code.
The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program’s security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool’s capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.