946 results for security of title
Abstract:
"January 1985."
Abstract:
"February 26, 1991"--Pts. 2 and 3.
Abstract:
At head of title: 97th Congress, 1st session. Committee print. WMCP: 97-15.
Abstract:
This research used the Queensland Police Service, Australia, as a major case study. Information on the principles, techniques and processes used, and the reasons for the recording, storing and release of audit information for evidentiary purposes, is reported. It is shown that law enforcement agencies have a two-fold interest in, and legal obligation pertaining to, audit trails: the first relates to situations where audit trails are actually used by criminals in the commission of crime, and the second to audit trails generated by the information systems used by the police themselves in support of the recording and investigation of crime. Eleven court cases involving Queensland Police Service audit trails used in evidence in Queensland courts were selected for further analysis. It is shown that, of the cases studied, none of the evidence presented was rejected or seriously challenged from a technical perspective. These results were further analysed and related to normal requirements for trusted maintenance of audit trail information in sensitive environments, with discussion of the ability and/or willingness of courts to fully challenge, assess or value the audit evidence presented. Managerial and technical frameworks are proposed for, firstly, what may be considered an environment in which a computer system can be said to be operating “properly” and, secondly, what education, training, qualifications, expertise and the like may be considered appropriate for the persons responsible within that environment.

Analysis was undertaken to determine whether audit and control of information in a high-security environment, such as law enforcement, could be judged to have improved, or not, in the transition from manual to electronic processes. Information collection, control of processing and audit in the manual processes used by the Queensland Police Service in the period 1940 to 1980 were assessed against the current electronic systems introduced to policing during the 1980s and 1990s. Results show that electronic systems do provide faster communications, with centrally controlled and updated information readily available to large numbers of users connected across significant geographical distances. However, it is clearly evident that the price paid for this is a lack of ability, and/or a reluctance, to provide improved audit and control processes.

To compare the information systems audit and control arrangements of the Queensland Police Service with those of other government departments and agencies, an Australia-wide survey was conducted. Its results were contrasted with those of a survey conducted four years previously by the Australian Commonwealth Privacy Commission, which had shown that security in relation to the recording of activity against access to information held on Australian government computer systems was poor and a cause for concern. Within this four-year period, however, there is evidence to suggest that government organisations have become increasingly inclined to generate audit trails.

An attack on the overall security of audit trails in computer operating systems was initiated to further investigate the findings of the government systems survey, which showed that information systems audit trails in Microsoft Corporation's “Windows” operating system environments are relied on quite heavily. An audit of the security of audit trails generated, stored and managed in the Microsoft “Windows 2000” operating system environment was undertaken and compared and contrasted with similar audit trail schemes in the “UNIX” and “Linux” operating systems. Password strength and the exploitation of security problems in access control were targeted using software tools freely available in the public domain. Results showed that this security in the “Windows 2000” system is seriously flawed, and that the integrity of audit trails stored within these environments cannot be relied upon.

Finally, a framework and set of guidelines for use by expert witnesses in the information technology (IT) profession are proposed. This is achieved by examining the current rules and guidelines for the provision of expert evidence in a court environment, by analysing the rationale for the separation of distinct disciplines and corresponding bodies of knowledge in the medical profession and forensic science, and then by analysing the bodies of knowledge within the discipline of IT itself. It is demonstrated that the accepted processes and procedures relevant to expert witnessing in a court environment are transferable to the IT sector. Unlike some discipline areas, however, this analysis clearly identified two distinct aspects that appear particularly relevant to IT: expertise gained through the application of IT to the information needs of a particular public or private enterprise; and expertise gained through accepted and verifiable education, training and experience in fundamental IT products and systems.
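The thesis's central requirement, trusted maintenance of audit trail information, can be illustrated with a small tamper-evident logging sketch. The Python below is not from the thesis; it is a minimal, hypothetical example of hash chaining, one standard way to make after-the-fact modification of stored audit records detectable.

```python
import hashlib
import json

def append_entry(log, record):
    """Append an audit record, chaining it to the hash of the previous entry."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(record, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"record": record, "prev": prev_hash, "hash": entry_hash})

def verify_chain(log):
    """Recompute every link; any edited or deleted entry breaks the chain."""
    prev_hash = "0" * 64
    for entry in log:
        payload = json.dumps(entry["record"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, {"user": "op1", "action": "query", "subject": "record 42"})
append_entry(log, {"user": "op2", "action": "update", "subject": "record 7"})
assert verify_chain(log)
log[0]["record"]["user"] = "op9"   # simulate tampering with a stored entry
assert not verify_chain(log)
```

A scheme like this only makes tampering evident; preventing it also requires the access-control and key-management measures whose weaknesses the thesis documents.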
Abstract:
Refactoring focuses on improving the reusability, maintainability and performance of programs. However, the impact of refactoring on the security of a given program has received little attention. In this work, we focus on the design of object-oriented applications and use metrics to assess the impact of a number of standard refactoring rules on their security by evaluating the metrics before and after refactoring. This assessment tells us which refactoring steps can increase the security level of a given program from the point of view of potential information flow, allowing application designers to improve their system’s security at an early stage.
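As a rough illustration of a metric of this kind (not the authors' actual metrics), one can measure what fraction of a class's attributes is publicly accessible, so that a refactoring such as Encapsulate Field changes the value. A minimal Python sketch with hypothetical classes:

```python
def attribute_exposure(cls):
    """Toy metric: fraction of a class's non-callable attributes that are
    public. Lower values mean less directly accessible state."""
    attrs = [name for name in vars(cls)
             if not name.startswith("__") and not callable(getattr(cls, name))]
    public = [name for name in attrs if not name.startswith("_")]
    return len(public) / len(attrs) if attrs else 0.0

class Before:              # field exposed directly to any client
    balance = 0

class After:               # same state encapsulated behind an accessor
    _balance = 0
    def get_balance(self):
        return self._balance

print(attribute_exposure(Before))   # 1.0: all state public
print(attribute_exposure(After))    # 0.0: state hidden after refactoring
```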
Abstract:
Refactoring is a common approach to producing better quality software. Its impact on many software quality properties, including reusability, maintainability and performance, has been studied and measured extensively. However, its impact on the information security of programs has received relatively little attention. In this work, we assess the impact of a number of the most common code-level refactoring rules on data security, using security metrics that are capable of measuring security from the viewpoint of potential information flow. The metrics are calculated for a given Java program using a static analysis tool we have developed to automatically analyse compiled Java bytecode. We ran our Java code analyser on various programs which were refactored according to each rule. New values of the metrics for the refactored programs then confirmed that the code changes had a measurable effect on information security.
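The authors' tool analyses compiled Java bytecode; the same idea, shrunk to a Python analogue using the standard dis module, might look as follows. The metric and the two functions are invented for illustration and are not the paper's.

```python
import dis

def state_reads(func):
    """Crude static metric over compiled bytecode: count instructions that
    read object attributes, a proxy for points of potential information flow."""
    return sum(1 for ins in dis.get_instructions(func)
               if ins.opname in ("LOAD_ATTR", "LOAD_METHOD"))

def before(account):
    return account.balance - account.fee    # two direct reads of object state

def after(account):
    return account.net_balance()            # reads hidden behind one accessor

print(state_reads(before), state_reads(after))   # 2 1
```

Running such a counter over the original and refactored versions of a program gives the before/after comparison the paper describes, with the real work lying in choosing metrics that actually track information flow.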
Abstract:
Australia has new national legislation - the Personal Property Securities Act 2009 (Cth) and the Personal Property Securities Regulations 2010 - which commenced operation on 30 January 2012. The policy objectives of the new legislation are to increase certainty and consistency and to reduce complexity and cost. To achieve this, the legislation treats like transactions alike, by focusing on substance over form, and so removes distinctions between security interests that were based on their structure. Differences based on the location or nature of the secured property and on the debtor’s legal form, as an individual or company, have also disappeared. We now have one single national scheme and one national electronic registration system for all security interests throughout Australia. The Act applies to security interests in tangible and intangible personal property, including those based on some form of title retention which are not security interests under the general law. This legislation rationalises previous laws and brings about substantial changes to this area of law. This paper seeks to explain the principal changes and their implications.
Abstract:
Non-linear feedback shift register (NLFSR) ciphers are cryptographic tools of choice in industry, especially for mobile communication. Their attractive feature is high efficiency when implemented in hardware or software. However, the main problem with NLFSR ciphers is that their security is still not well investigated. This paper makes progress in the study of the security of NLFSR ciphers. In particular, we show a distinguishing attack on linearly filtered NLFSR (LF-NLFSR) ciphers. We extend the attack to a linear combination of LF-NLFSRs. We also investigate the security of a modified version of the Grain stream cipher and show its vulnerability to both key recovery and distinguishing attacks.
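The flavour of a distinguishing attack can be conveyed with a toy experiment: generate keystream from a small linearly filtered NLFSR and search for a linear relation on output bits whose empirical probability deviates from the 1/2 expected of a random sequence. Everything below (taps, filter, sizes) is invented for illustration and is far weaker than the analysis in the paper; for this toy the deviation found may be modest, but the shape of the test is the point.

```python
import os

def lf_nlfsr(state_bits, n_out):
    """Toy 16-bit NLFSR with a quadratic (AND) feedback term and a linear
    output filter. Tap positions are illustrative only."""
    s = list(state_bits)
    out = []
    for _ in range(n_out):
        out.append(s[0] ^ s[6] ^ s[13])            # linear filter on the state
        fb = s[0] ^ s[3] ^ s[5] ^ (s[2] & s[9])    # nonlinear feedback bit
        s = s[1:] + [fb]
    return out

def best_pair_bias(bits, max_offset=32):
    """Search simple linear relations b[i] XOR b[i+d] for the one whose
    empirical probability of 1 deviates most from 1/2."""
    n = len(bits)
    best = (-1.0, 0)
    for d in range(1, max_offset + 1):
        ones = sum(bits[i] ^ bits[i + d] for i in range(n - d))
        bias = abs(ones / (n - d) - 0.5)
        best = max(best, (bias, d))
    return best

seed = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
keystream = lf_nlfsr(seed, 1 << 16)
random_bits = [b & 1 for b in os.urandom(1 << 16)]   # baseline for comparison
print("cipher :", best_pair_bias(keystream))
print("random :", best_pair_bias(random_bits))
```

A real distinguisher derives the biased relation analytically from the filter and feedback functions rather than searching blindly, which is what lets it scale to full-size registers.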
Abstract:
“Supermax” prisons, conceived by the United States in the early 1980s, are typically reserved for convicted political criminals such as terrorists and spies and for other inmates who are considered to pose a serious ongoing threat to the wider community, to the security of correctional institutions, or to the safety of other inmates. Prisoners are usually restricted to their cells for up to twenty-three hours a day and typically have minimal contact with other inmates and correctional staff. Not only does the Federal Bureau of Prisons operate one of these facilities, but almost every state has either a supermax wing or a stand-alone supermax prison. The Globalization of Supermax Prisons examines why nine advanced industrialized countries have adopted the supermax prototype, paying particular attention to the economic, social, and political processes that have affected each state. Featuring essays that look at the U.S.-run prisons of Abu Ghraib and Guantánamo, this collection seeks to determine whether the American model is the basis for the establishment of these facilities and considers such issues as support for, and opposition to, the building of a supermax and why opposition efforts failed; allegations of human rights abuses within these prisons; and the extent to which the decision to build a supermax was influenced by developments in the United States. Additionally, contributors address such domestic matters as the role of crime rates, media sensationalism, and terrorism in each country’s decision to build a supermax prison.
Abstract:
Private title insurance has been the subject of much debate by law reform bodies and academics. This article adds a new dimension to the discussion by analysing its role against a recent scenario in which a nun was betrayed by the actions of her brother, and compensation payable from the assurance fund, after much challenge by the registrar, amounted to more than $4 million. We ask whether the slow burning of title insurance into the psyche of Australian home purchasers will see state-based assurance funds looking to minimise their role in the Torrens system. We also query how the rather more immediate establishment of electronic conveyancing will alter the balance between the assurance fund, private title insurance and the increasing responsibilities on stakeholders involved in conveyancing.
Abstract:
At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar to second preimage resistance rather than on the collision resistance of the hash functions. One of the randomized hash function modes was named the RMX hash function mode and was recommended for practical purposes. The National Institute of Standards and Technology (NIST), USA, standardized a variant of the RMX hash function mode and published this standard in Special Publication (SP) 800-106. In this article, we first discuss a generic online birthday existential forgery attack by Dang and Perlner on the RMX-hash-then-sign schemes. We show that a variant of this attack can be applied to forge other randomize-hash-then-sign schemes. We point out practical limitations of the generic forgery attack on the RMX-hash-then-sign schemes. We then show that these limitations can be overcome for the RMX-hash-then-sign schemes if it is easy to find fixed points for the underlying compression functions, as is the case for the Davies-Meyer construction used in popular hash functions such as MD5, designed by Rivest, and the SHA family of hash functions designed by the National Security Agency (NSA), USA, and published by NIST in the Federal Information Processing Standards (FIPS). We show an online birthday forgery attack on this class of signatures by using a variant of Dean’s method of finding fixed-point expandable messages for hash functions based on the Davies-Meyer construction. This forgery attack is also applicable to signature schemes based on the variant of RMX standardized by NIST in SP 800-106. We discuss some important applications of our attacks and their applicability to signature schemes based on hash functions with ‘built-in’ randomization. Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash-based message authentication code (HMAC).
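The Davies-Meyer fixed points that drive the stronger attack are easy to exhibit concretely: since h' = E_m(h) XOR h, choosing h = E_m^{-1}(0) gives h' = h for any message block m. A short sketch, assuming the third-party pycryptodome package and using AES-128 as the block cipher (an illustration, not the article's construction):

```python
from Crypto.Cipher import AES   # assumption: pycryptodome is installed

def davies_meyer(h, m):
    """One Davies-Meyer compression step: h' = E_m(h) XOR h, with AES-128 as E."""
    e = AES.new(m, AES.MODE_ECB).encrypt(h)
    return bytes(a ^ b for a, b in zip(e, h))

m = b"any 16-byte blk!"                                  # message block = AES key
h_fixed = AES.new(m, AES.MODE_ECB).decrypt(bytes(16))    # h = E_m^{-1}(0)
assert davies_meyer(h_fixed, m) == h_fixed               # fixed point: h' = h
print(h_fixed.hex())
```

Dean's observation is that such fixed points make it cheap to build expandable messages, which the article combines with an online birthday search to forge randomize-hash-then-sign signatures.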
Abstract:
This thesis evaluates the security of Supervisory Control and Data Acquisition (SCADA) systems, which are one of the key foundations of many critical infrastructures. Specifically, it examines one of the standardised SCADA protocols, the Distributed Network Protocol Version 3 (DNP3), which attempts to provide a security mechanism ensuring that messages transmitted between devices are adequately secured from rogue applications. To achieve this, the thesis applies formal methods from theoretical computer science to formally analyse the correctness of the protocol.
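The kind of analysis involved can be hinted at with a toy explicit-state model check. The model below is hypothetical and vastly simpler than DNP3 and than the thesis's formal analysis: it exhaustively explores a challenge-response state machine and searches for a trace in which a critical action executes without authentication.

```python
from collections import deque

def step(mode, secured):
    """Toy transition relation. In the 'secured' variant a critical request
    must pass a challenge-response exchange before the action executes."""
    if mode == "idle":
        yield "requested", "critical_request"
    elif mode == "requested":
        if secured:
            yield "challenged", "challenge_issued"
        else:
            yield "idle", "EXECUTE:unauthenticated"
    elif mode == "challenged":
        yield "idle", "EXECUTE:authenticated"

def find_violation(secured, max_depth=8):
    """Breadth-first search over all reachable behaviours, looking for a
    trace in which the action executes without authentication."""
    queue, seen = deque([("idle", [])]), {"idle"}
    while queue:
        mode, trace = queue.popleft()
        if len(trace) >= max_depth:
            continue
        for nxt, event in step(mode, secured):
            if event == "EXECUTE:unauthenticated":
                return trace + [event]             # counterexample found
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, trace + [event]))
    return None                                    # property holds in the model

print(find_violation(secured=False))  # ['critical_request', 'EXECUTE:unauthenticated']
print(find_violation(secured=True))   # None
```

Formal tools make the same move at scale: state the desired property, explore every reachable behaviour of the protocol model, and either prove the property or return a counterexample trace.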
Abstract:
Electric power systems are exposed to various contingencies. Network contingencies often contribute to overloading of network branches and unsatisfactory voltages, and can also lead to stability or voltage-collapse problems. To maintain the security of such systems, it is desirable to estimate the effect of contingencies and to plan suitable measures to improve system security and stability. This paper presents an approach for selecting suitable locations for unified power flow controllers (UPFCs), considering normal conditions and network contingencies after evaluating the degree of severity of the contingencies. The ranking is evaluated using composite-criteria-based fuzzy logic to eliminate the masking effect. In addition to real power loadings and bus voltage violations, the fuzzy approach also uses voltage stability indices at the load buses as post-contingency quantities in evaluating the network contingency ranking. Suitable UPFC locations are then selected on the basis of improved system security and stability. The proposed approach has been tested under simulated conditions on several power systems, and results for a 24-node real-life equivalent EHV power network and a 39-node New England (modified) test system are presented for illustration.
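A much-reduced sketch of a composite-criteria fuzzy severity index follows; the membership functions, weights and contingency data are all invented, and the paper's actual criteria also include voltage stability indices.

```python
def ramp_up(x, lo, hi):
    """Fuzzy membership rising linearly from 0 at lo to 1 at hi."""
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

def ramp_down(x, lo, hi):
    """Fuzzy membership falling linearly from 1 at lo to 0 at hi."""
    return 1.0 - ramp_up(x, lo, hi)

def severity(loading_pct, voltage_pu):
    """Toy composite severity for one post-contingency operating point:
    a weighted blend of 'overloaded' and 'low voltage' membership degrees."""
    overloaded = ramp_up(loading_pct, 100.0, 140.0)    # % of thermal rating
    low_voltage = ramp_down(voltage_pu, 0.90, 0.95)    # per-unit bus voltage
    return 0.6 * overloaded + 0.4 * low_voltage        # invented weights

# Rank three hypothetical contingencies by the composite index.
contingencies = {"line 3-5 outage": (130.0, 0.92),
                 "line 1-2 outage": (95.0, 0.88),
                 "generator 7 outage": (105.0, 0.97)}
for name, point in sorted(contingencies.items(),
                          key=lambda kv: severity(*kv[1]), reverse=True):
    print(f"{name}: severity = {severity(*point):.2f}")
```

Blending several post-contingency quantities into one index is what counters the masking effect, where a contingency that is moderate on every single criterion would otherwise be ranked below one that is extreme on only one.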
Abstract:
The notion that women are limited to bearing children and cooking for the family, restricted within the fenced compound without any meaningful contribution to fish food production, was dispelled during the course of the study. From the data gathered, the study revealed various contributions of women to fish food production: about 2% of the women are involved in direct fishing, which enhances the food security of the family and of society. Women also dominate the entire post-harvest and marketing sector, and 70% started their fishing business with their personal savings. Some of the women own boats and other fishing inputs, which they give to fishermen who catch the fish and sell it to them. This enhances fish catch and the fish food security of the people, as men who would otherwise have sat idle for lack of fishing gear are now meaningfully engaged, courtesy of the women financiers. Finally, the study also revealed that 46% of the women earn between N2,500 and above N4,000 from the marketing of fish, and utilize the income generated to enhance the welfare of their households in the areas of food, clothing and paying their children's school fees, hence reducing the level of poverty of their households.
Abstract:
The interpolation attack was presented by Jakobsen and Knudsen at FSE'97. It is effective against ciphers that have a certain algebraic structure, such as the prototype cipher PURE, but it is difficult to apply to real-world ciphers. This difficulty stems from the difficulty of deriving a low-degree polynomial relation between ciphertexts and plaintexts; in other words, it is difficult to evaluate a cipher's security against the interpolation attack. This paper generalizes the interpolation attack, and the generalization makes it easier to evaluate security against it. We call the generalized interpolation attack the linear sum attack. We present an algorithm that evaluates the security of byte-oriented ciphers against the linear sum attack. Moreover, we show the relationship between the linear sum attack and the higher order differential attack. In addition, we show the security of CRYPTON, E2, and RIJNDAEL against the linear sum attack using the algorithm.
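The core idea is easy to reproduce on a toy cipher in the spirit of PURE (the field, exponent and key below are invented): when the ciphertext is a low-degree polynomial in the plaintext, Lagrange interpolation over a few known pairs recovers that polynomial, letting an attacker forge ciphertexts without the key.

```python
Q = 2**13 - 1   # small prime field for illustration; real targets use GF(2^n)

def encrypt(p, k):
    """Toy PURE-like cipher: the ciphertext is a degree-3 polynomial in p."""
    return pow((p + k) % Q, 3, Q)

def lagrange_eval(points, x):
    """Evaluate, at x, the unique polynomial through the given points mod Q."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num = den = 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % Q
                den = den * (xi - xj) % Q
        total = (total + yi * num * pow(den, -1, Q)) % Q
    return total

key = 2024                                             # never used by the attacker
known = [(p, encrypt(p, key)) for p in (1, 2, 3, 4)]   # 4 pairs fix a cubic
forged = lagrange_eval(known, 1000)                    # predict a fresh ciphertext
assert forged == encrypt(1000, key)
print("forged ciphertext for plaintext 1000:", forged)
```

The attack needs one known pair per polynomial coefficient, which is why it breaks down once the relation between plaintext and ciphertext has high degree, the obstacle the linear sum generalization is designed to assess.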