900 results for Computer Security
Abstract:
With increasing interest shown by universities in workplace learning, especially in STEM disciplines, an issue has arisen amongst educators and industry partners regarding authentic assessment tasks for work integrated learning (WIL) subjects. This paper describes the use of a matrix, also available as a decision tree, based on the features of the WIL experience, to facilitate the selection of appropriate assessment strategies. The matrix divides WIL experiences into seven categories, based on factors such as: the extent to which the experience is compulsory, required for membership of a professional body, or elective; whether the student is undertaking a project or embedding in a professional culture; and other key aspects of the WIL experience. One important variable is linked to the fundamental purpose of the assessment. This question revolves around the focus of the assessment: whether on the person (student development), the process (professional conduct/language), or the product (project, assignment, literature review, report, software). The matrix has been trialled at QUT in the Faculty of Science and Technology, and also at the University of Surrey, UK, and has proven to have good applicability in both universities.
Abstract:
Mainstream discourse on food security often revolves around macro-level indicators of nutrition, consumption and food production. While these indicators may prove significant in addressing food security at the national and regional levels, they fall short in addressing it among indigenous peoples' (IP) communities in the Philippines. Reflecting on experiences in agricultural production, indigenous knowledge and socio-political institutions are relevant factors that must be seriously considered where food security among IPs is concerned. It is argued that disregarding micro-level interactions in favour of macro development policies will not address the issue of food security among marginalized sectors. The paper presents policy recommendations for taking cultural systems seriously in addressing food security among indigenous peoples.
Abstract:
The purpose of the current study was to develop a measurement of information security culture in developing countries such as Saudi Arabia. In order to achieve this goal, the study commenced with a comprehensive review of the literature, the outcome being the development of a conceptual model as a reference base. The literature review revealed a lack of academic and professional research into information security culture in developing countries and, more specifically, in Saudi Arabia. Given the increasing importance and significant investment developing countries are making in information technology, there is a clear need to investigate information security culture from the perspective of developing countries such as Saudi Arabia. Furthermore, our analysis indicated a lack of clear conceptualization of, and distinction between, factors that constitute information security culture and factors that influence information security culture. Our research aims to fill this gap by developing and validating a measurement model of information security culture, as well as developing an initial understanding of the factors that influence security culture. A sequential mixed method, consisting of a qualitative phase to explore the conceptualisation of information security culture and a quantitative phase to validate the model, is adopted for this research. In the qualitative phase, eight interviews with information security experts in eight different Saudi organisations were conducted, revealing that security culture can be constituted as a reflection of security awareness, security compliance and security ownership. Additionally, the qualitative interviews revealed that the factors that influence security culture are top management involvement, policy enforcement, policy maintenance, training and ethical conduct policies. These factors were confirmed by the literature review as being critical and important for the creation of security culture and formed the basis for our initial information security culture model, which was operationalised and tested in different Saudi Arabian organisations. Using data from two hundred and fifty-four valid responses, we demonstrated the validity and reliability of the information security culture model through Exploratory Factor Analysis (EFA), followed by Confirmatory Factor Analysis (CFA). In addition, using Structural Equation Modelling (SEM), we were further able to demonstrate the validity of the model in a nomological net, as well as provide some preliminary findings on the factors that influence information security culture. The current study contributes to the existing body of knowledge in two major ways: firstly, it develops an information security culture measurement model; secondly, it presents empirical evidence for the nomological validity of the security culture measurement model and the discovery of factors that influence information security culture. The current study also indicates possible future related research needs.
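As a hedged illustration of the quantitative step only, the sketch below runs a simple exploratory factor analysis over Likert-scale survey responses using scikit-learn; the item names and the three-factor structure (awareness, compliance, ownership) are assumptions for illustration, and the study's actual analysis used EFA, CFA and SEM on its own instrument.

```python
# Illustrative sketch only: a simple exploratory factor analysis over survey items.
# Column names (awareness_1, compliance_1, ...) are hypothetical, not the study's instrument.
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

def explore_security_culture_factors(responses: pd.DataFrame, n_factors: int = 3) -> pd.DataFrame:
    """Return an item-by-factor loading matrix for Likert-scale survey responses."""
    items = StandardScaler().fit_transform(responses)      # standardise each item
    fa = FactorAnalysis(n_components=n_factors, rotation="varimax", random_state=0)
    fa.fit(items)
    return pd.DataFrame(fa.components_.T,
                        index=responses.columns,
                        columns=[f"factor_{i + 1}" for i in range(n_factors)])

# Example usage with hypothetical items:
# loadings = explore_security_culture_factors(df[["awareness_1", "awareness_2",
#                                                 "compliance_1", "compliance_2",
#                                                 "ownership_1", "ownership_2"]])
# print(loadings.round(2))
```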
Abstract:
A5/1 is a shift register based stream cipher which provides privacy for the GSM system. In this paper, we analyse the loading of the secret key and IV during the initialisation process of A5/1. We demonstrate the existence of weak key-IV pairs in the A5/1 cipher due to this loading process; these weak key-IV pairs may generate one, two or three registers containing all-zero values, which may lead in turn to weak keystream sequences. In the case where two or three registers contain only zeros, we describe a distinguisher which leads to a complete decryption of the affected messages.
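For readers unfamiliar with the loading process the abstract refers to, the following is a minimal sketch of A5/1 key/IV loading, assuming the register lengths and feedback taps from the commonly published description of the cipher; it simply checks which registers end up all-zero for a given key-IV pair and is not the authors' analysis code.

```python
# Simplified sketch of the A5/1 loading phase (not the paper's analysis code).
# Register lengths and feedback taps follow the commonly published description
# of A5/1 and should be treated as assumptions for illustration.

R_LENGTHS = (19, 22, 23)
R_TAPS = ((13, 16, 17, 18), (20, 21), (7, 20, 21, 22))  # feedback tap positions

def clock_with_input(reg, length, taps, in_bit):
    """Clock one register once, XORing in_bit into the feedback."""
    fb = in_bit
    for t in taps:
        fb ^= (reg >> t) & 1
    return ((reg << 1) | fb) & ((1 << length) - 1)

def load_key_iv(key_bits, iv_bits):
    """Load a 64-bit key then a 22-bit IV; all registers are clocked regularly."""
    regs = [0, 0, 0]
    for bit in list(key_bits) + list(iv_bits):
        regs = [clock_with_input(r, n, t, bit)
                for r, n, t in zip(regs, R_LENGTHS, R_TAPS)]
    return regs

def zero_registers(key_bits, iv_bits):
    """Indices of registers that are all-zero after loading (the weak cases)."""
    return [i for i, r in enumerate(load_key_iv(key_bits, iv_bits)) if r == 0]
```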
Abstract:
This paper describes the development of an analytical model used to simulate the fatigue behaviour of roof cladding during the passage of a tropical cyclone. The model, incorporated into a computer program, uses wind pressure data from wind tunnel tests in combination with time-history information on wind speed and direction during a tropical cyclone, and experimental fatigue characteristics of roof claddings. The wind pressure data is analysed using a rainflow form of analysis, and a fatigue damage index is calculated using a modified form of Miner's rule. Some of the results obtained to date and their significance in relation to the review of current fatigue tests are presented. The model appears to be reasonable for comparative estimation of fatigue life, but an improvement of Miner's rule is required for the prediction of actual fatigue life.
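As a hedged sketch of the damage accumulation the abstract describes, the snippet below computes a Palmgren-Miner damage index, D = Σ n_i / N_i, from rainflow-counted stress ranges; the power-law S-N curve and its parameters are placeholders rather than the cladding fatigue characteristics or the modified rule used in the paper.

```python
# Illustrative sketch of a Palmgren-Miner damage index, D = sum(n_i / N_i).
# The S-N curve parameters below are placeholders, not the cladding fatigue
# characteristics used in the paper.
from collections import Counter

def cycles_to_failure(stress_range: float, C: float = 1e12, m: float = 3.0) -> float:
    """Assumed power-law S-N curve: N = C * S**(-m)."""
    return C * stress_range ** (-m)

def miner_damage_index(rainflow_ranges) -> float:
    """Accumulate damage over rainflow-counted stress ranges; D >= 1 implies failure."""
    counts = Counter(round(s, 1) for s in rainflow_ranges)   # bin the cycle ranges
    return sum(n_i / cycles_to_failure(s_i) for s_i, n_i in counts.items())

# Example with a short history of cycle ranges (arbitrary units):
# print(miner_damage_index([40.0, 40.0, 55.3, 70.1, 70.1, 70.1]))
```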
Abstract:
GO423 was initiated in 2012 as part of a community effort to ensure the vitality of the Queensland Games Sector. In common with other industrialised nations, the game industry in Australia is a reasonably significant contributor to Gross National Product (GNP). Games are played in 92% of Australian homes and the average adult player has been playing them for at least twelve years, with 26% playing for more than thirty years (Brand, 2011). Like the games and interactive entertainment industries in other countries, the Australian industry has its roots in the small team model of the 1980s. So, for example, Beam Software, which was established in Melbourne in 1980, was started by two people and Krome Studios was started in 1999 by three. Both these companies grew to employ over 100 people in their heydays (considered large by Antipodean standards), not by producing their own intellectual property (IP) but by content generation for offshore parent companies. Thus our bigger companies grew on a model of service provision and tended not to generate their own IP (Darchen, 2012). There are some notable exceptions where IP has originated locally and been acquired by international companies, but in the case of some of the works of which we are most proud, the Australian company took on the role of "Night Elf" – a convenience due to affordances of the time zone which allowed our companies to work while the parent companies slept in a different time zone. In the post-GFC climate, the strong Australian dollar and the vulnerability of such service provision mean that job security is virtually non-existent, with employees invariably being on short-term contracts. These issues are exacerbated by the decline of middle-ground games (those which fall between the triple-A titles and the smaller games often produced for a casual audience). The response to this state of affairs has been a change in the Australian games industry towards new recognition of its identity as a wider cultural sector and the rise (or return) of an increasing number of small independent game development companies. 'Indies' consist of small teams, often making games for mobile and casual platforms, that depend on producing at least one if not two games a year and who often explore more radical definitions of games as designed cultural objects. The need for innovation and creativity in the Australian context is seen as a vital aspect of the current changing scene, where we see the emphasis on the large studio production model give way to an emerging cultural sector model in which small independent teams are engaged in shorter design and production schedules driven by digital distribution. In terms of Quality of Life (QoL), this new digital distribution brings with it the danger of 'digital isolation': a studio can work from home and deliver from home. Community events thus become increasingly important. The GO423 Symposium is a response to these perceived needs, and the event is based on the understanding that our new small creative teams depend on the local community of practice in no small way. GO423 thus offers local industry participants the opportunity to talk to each other about their work, to talk to potential new members about their work and to show off their work in a small, intimate situation, encouraging both feedback and support.
Abstract:
This paper presents a new framework for distributed intrusion detection based on taint marking. Our system tracks information flows between applications of multiple hosts gathered in groups (i.e., sets of hosts sharing the same distributed information flow policy) by attaching taint labels to system objects such as files, sockets, Inter Process Communication (IPC) abstractions, and memory mappings. Labels are carried over the network by tainting network packets. A distributed information flow policy is defined for each group at the host level by labeling information and defining how users and applications can legally access, alter or transfer information towards other trusted or untrusted hosts. As opposed to existing approaches, where information is most often represented by two security levels (low/high, public/private, etc.), our model identifies each piece of information within a distributed system, and defines their legal interaction in a fine-grained manner. Hosts store and exchange security labels in a peer to peer fashion, and there is no central monitor. Our IDS is implemented in the Linux kernel as a Linux Security Module (LSM) and runs standard software on commodity hardware with no required modification. The only trusted code is our modified operating system kernel. We finally present a scenario of intrusion in a web service running on multiple hosts, and show how our distributed IDS is able to report security violations at each host level.
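To make the label-propagation idea concrete, here is a toy, user-space sketch of taint labels attached to objects and checked against a per-group policy before data leaves for another host; the real system is a Linux Security Module running in the kernel, and the object and policy names below are hypothetical.

```python
# Toy, user-space illustration of taint-label propagation between system objects.
# The system described in the paper is a kernel-level Linux Security Module;
# the object identifiers and policy entries here are hypothetical.

class TaintTracker:
    def __init__(self, policy):
        # policy: set of (label, destination_host) pairs that are allowed
        self.policy = policy
        self.labels = {}                      # object id -> set of taint labels

    def label(self, obj, tag):
        self.labels.setdefault(obj, set()).add(tag)

    def flow(self, src, dst):
        """Information flows from src to dst (e.g., read then write): union the labels."""
        self.labels.setdefault(dst, set()).update(self.labels.get(src, set()))

    def send(self, obj, host):
        """Check the distributed policy before tainted data leaves for another host."""
        violations = {t for t in self.labels.get(obj, set())
                      if (t, host) not in self.policy}
        if violations:
            raise PermissionError(f"policy violation: {violations} -> {host}")

# Example: a secret file read by an app and written to a socket bound for an
# untrusted host would trigger a report.
tracker = TaintTracker(policy={("secret", "trusted.example")})
tracker.label("/srv/db/creds", "secret")
tracker.flow("/srv/db/creds", "socket:tcp/443")
# tracker.send("socket:tcp/443", "untrusted.example")   # would raise PermissionError
```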
Abstract:
This paper presents a model for the generation of a MAC tag using a stream cipher. The input message is used indirectly to control segments of the keystream that form the MAC tag. Several recent proposals can be considered as instances of this general model, as they all perform message accumulation in this way. However, they use slightly different processes in the message preparation and finalisation phases. We examine the security of this model for different options and against different types of attack, and conclude that the indirect injection model can be used to generate MAC tags securely for certain combinations of options. Careful consideration is required at the design stage to avoid combinations of options that result in susceptibility to forgery attacks. Additionally, some implementations may be vulnerable to side-channel attacks if used in Authenticated Encryption (AE) algorithms. We give design recommendations to provide resistance to these attacks for proposals following this model.
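As a rough illustration of the general model (not of any specific proposal analysed in the paper), the toy below accumulates keystream segments into a tag under the control of message bits, deliberately omitting the message preparation and finalisation phases whose careful design the paper shows to be critical.

```python
# Toy sketch of the "indirect injection" MAC model: message bits control which
# keystream segments are accumulated into the tag. Illustrative only; the
# keystream generator is a hash-based stand-in, not a real stream cipher, and
# the preparation/finalisation phases discussed in the paper are omitted.
import hashlib

def keystream(key: bytes, nonce: bytes, n_segments: int, seg_len: int = 16):
    """Stand-in keystream generator (hash counter mode, for illustration only)."""
    for i in range(n_segments):
        yield hashlib.sha256(key + nonce + i.to_bytes(4, "big")).digest()[:seg_len]

def mac_tag(key: bytes, nonce: bytes, message_bits, seg_len: int = 16) -> bytes:
    acc = bytes(seg_len)
    segments = keystream(key, nonce, len(message_bits), seg_len)
    for bit, seg in zip(message_bits, segments):
        if bit:                                # message bit controls accumulation
            acc = bytes(a ^ s for a, s in zip(acc, seg))
    return acc

# tag = mac_tag(b"k" * 16, b"n" * 8, [1, 0, 1, 1, 0, 1])
```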
Abstract:
Trivium is a bit-based stream cipher in the final portfolio of the eSTREAM project. In this paper, we apply the algebraic attack approach of Berbain et al. to Trivium-like ciphers and perform new analyses on them. We demonstrate a new algebraic attack on Bivium-A. This attack requires less time and memory than previous techniques to recover Bivium-A's initial state. Though our attacks on Bivium-B, Trivium and Trivium-N are worse than exhaustive keysearch, the systems of equations which are constructed are smaller and less complex compared to previous algebraic analyses. We also answer an open question posed by Berbain et al. on the feasibility of applying their technique to Trivium-like ciphers. Factors which can affect the complexity of our attack on Trivium-like ciphers are discussed in detail. The analyses of Bivium-B and Trivium-N are omitted from this manuscript. The full paper is available on the IACR ePrint Archive.
Abstract:
Microvessel density (MVD) is a widely used surrogate measure of angiogenesis in pathological specimens and tumour models. Measurement of MVD can be achieved by several methods. Automation of counting methods aims to increase the speed, reliability and reproducibility of these techniques. The image analysis system described here enables MVD measurement to be carried out with minimal expense in any reasonably equipped pathology department or laboratory. It is demonstrated that the system translates easily, with minimal calibration, between tumour types that are suitably stained. The aim of this paper is to offer this technique to a wider field of researchers in angiogenesis.
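For orientation, a minimal sketch of the kind of thresholding-and-counting step such a system might perform is given below using scikit-image; it is not the system described here, and the stain threshold and size filter are arbitrary placeholders.

```python
# Minimal sketch of automated microvessel counting by colour thresholding and
# connected-component labelling. Not the image analysis system described in the
# abstract; the stain threshold and size filter are arbitrary placeholders.
from skimage import io, measure, morphology

def count_microvessels(image_path: str, stain_threshold: float = 0.5,
                       min_area_px: int = 30) -> int:
    rgb = io.imread(image_path)
    # Crude proxy for immunostain intensity: darkness in the red channel
    stain = 1.0 - rgb[..., 0] / 255.0
    mask = stain > stain_threshold
    mask = morphology.remove_small_objects(mask, min_size=min_area_px)
    labels = measure.label(mask)
    return int(labels.max())         # number of connected stained regions

# Hypothetical usage, normalising by the field area to obtain a density:
# mvd = count_microvessels("hotspot_field.png") / field_area_mm2
```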
Abstract:
The notion of plaintext awareness (PA) has many applications in public key cryptography: it offers unique, stand-alone security guarantees for public key encryption schemes, has been used as a sufficient condition for proving indistinguishability against adaptive chosen-ciphertext attacks (IND-CCA), and can be used to construct privacy-preserving protocols such as deniable authentication. Unlike many other security notions, plaintext awareness is very fragile when it comes to differences between the random oracle and standard models; for example, many implications involving PA in the random oracle model are not valid in the standard model and vice versa. Similarly, strategies for proving PA of schemes in one model cannot be adapted to the other model. Existing research addresses PA in detail only in the public key setting. This paper gives the first formal exploration of plaintext awareness in the identity-based setting and, as initial work, proceeds in the random oracle model. The focus is mainly on identity-based key encapsulation mechanisms (IB-KEMs), for which the paper presents the first definitions of plaintext awareness, highlights the role of PA in proof strategies of IND-CCA security, and explores relationships between PA and other security properties. On the practical side, our work offers the first, highly efficient, general approach for building IB-KEMs that are simultaneously plaintext-aware and IND-CCA-secure. Our construction is inspired by the Fujisaki-Okamoto (FO) transform, but demands weaker and more natural properties of its building blocks. This result comes from a new look at the notion of γ-uniformity that was inherent in the original FO transform. We show that for IB-KEMs (and PK-KEMs), this assumption can be replaced with a weaker computational notion, which is in fact implied by one-wayness. Finally, we give the first concrete IB-KEM scheme that is PA and IND-CCA-secure by applying our construction to a popular IB-KEM and optimizing it for better performance.
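As a hedged sketch of the FO-style pattern that inspires this construction, the snippet below derives the encryption randomness and session key from a random seed and re-encrypts during decapsulation to check well-formedness; the generic enc/dec callables are assumptions of the sketch, not the paper's concrete building blocks or its exact transform.

```python
# Hedged sketch of a Fujisaki-Okamoto-style KEM pattern: determinise the
# encryption randomness from a random seed, derive the session key from the
# seed, and re-encrypt during decapsulation to reject malformed ciphertexts.
# `enc` / `dec` stand for a generic (identity-based) encryption scheme and are
# assumptions of this sketch.
import hashlib, os

def H(*parts):  # random-oracle stand-in
    return hashlib.sha256(b"|".join(parts)).digest()

def encapsulate(enc, public_params, identity):
    seed = os.urandom(32)
    randomness = H(b"rand", seed, identity)            # determinised coins
    ciphertext = enc(public_params, identity, seed, randomness)
    return ciphertext, H(b"key", seed, ciphertext)     # (ciphertext, session key)

def decapsulate(enc, dec, public_params, user_secret_key, identity, ciphertext):
    seed = dec(user_secret_key, ciphertext)
    if seed is None or enc(public_params, identity, seed,
                           H(b"rand", seed, identity)) != ciphertext:
        return None                                    # reject malformed ciphertexts
    return H(b"key", seed, ciphertext)
```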
Abstract:
The relationship between coronal knee laxity and the restraining properties of the collateral ligaments remains unknown. This study investigated correlations between the structural properties of the collateral ligaments and stress angles used in computer-assisted total knee arthroplasty (TKA), measured with an optically based navigation system. Ten fresh-frozen cadaveric knees (mean age: 81 ± 11 years) were dissected to leave the menisci, cruciate ligaments, posterior joint capsule and collateral ligaments. The resected femur and tibia were rigidly secured within a test system which permitted kinematic registration of the knee using a commercially available image-free navigation system. Frontal plane knee alignment and varus-valgus stress angles were acquired. The force applied during varus-valgus testing was quantified. Medial and lateral bone-collateral ligament-bone specimens were then prepared, mounted within a uni-axial materials testing machine, and extended to failure. Force and displacement data were used to calculate the principal structural properties of the ligaments. The mean varus laxity was 4 ± 1° and the mean valgus laxity was 4 ± 2°. The corresponding mean manual force applied was 10 ± 3 N and 11 ± 4 N, respectively. While measures of knee laxity were independent of the ultimate tensile strength and stiffness of the collateral ligaments, there was a significant correlation between the force applied during stress testing and the instantaneous stiffness of the medial (r = 0.91, p = 0.001) and lateral (r = 0.68, p = 0.04) collateral ligaments. These findings suggest that clinicians may perceive a rate of change of ligament stiffness as the end-point during assessment of collateral knee laxity.
Abstract:
In this paper we examine passenger actions and activities at the security screening points of Australian domestic and international airports. Our findings and analysis provide a more complete understanding of the current airport passenger security screening experience. The data in this paper comprise field studies conducted at two Australian airports, one domestic and one international. Video data was collected by cameras situated on either side of the security screening point. A total of one hundred and ninety-six passengers were observed. Two methods of analysis are used. First, the activities of passengers are coded and analysed to reveal the common activities at domestic and international security regimes and between quiet and busy periods. Second, observation of passenger activities is used to reveal uncommon aspects. The results show that passengers do more at security screening than being passively scanned. Passengers queue, unpack the required items from their bags and from their pockets, walk through the metal detector, re-pack and occasionally return to be re-screened. For each of these activities, passengers must understand the procedures at the security screening point and must co-ordinate various actions and objects in time and space. Through this coordination, passengers are active participants in making the security checkpoint function – they are co-producers of the security screening process.
Abstract:
Novel computer vision techniques have been developed for automatic monitoring of crowded environments such as airports, railway stations and shopping malls. Using video feeds from multiple cameras, the techniques enable crowd counting, crowd flow monitoring, queue monitoring and abnormal event detection. The outcome of the research is useful for surveillance applications and for obtaining operational metrics to improve business efficiency.
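As a rough, hedged illustration of one building block behind such monitoring (and not the techniques developed in this research), the OpenCV sketch below counts foreground blobs per frame using background subtraction; the area threshold is an arbitrary placeholder.

```python
# Crude illustration of a crowd-monitoring building block: background
# subtraction and blob counting on a video feed. Not the research techniques
# described above; the blob-area threshold is an arbitrary placeholder.
import cv2

def count_foreground_blobs(video_path: str, min_blob_area: int = 500):
    capture = cv2.VideoCapture(video_path)
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
    counts = []
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        mask = subtractor.apply(frame)
        mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]  # drop shadow pixels
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        counts.append(sum(1 for c in contours if cv2.contourArea(c) > min_blob_area))
    capture.release()
    return counts

# per_frame_counts = count_foreground_blobs("concourse_camera.mp4")
```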