251 results for Entity Authentication


Relevance:

10.00%

Publisher:

Summary:

Identity-Based (IB) cryptography is a rapidly emerging approach to public-key cryptography that does not require principals to pre-compute key pairs and obtain certificates for their public keys; instead, public keys can be arbitrary identifiers such as email addresses, while private keys are derived at any time by a trusted private key generator upon request by the designated principals. Despite the flurry of recent results on IB encryption and signature, some questions regarding the security and efficiency of practicing IB encryption (IBE) and signature (IBS) as a joint IB signature/encryption (IBSE) scheme with a common set of parameters and keys remain unanswered. We first propose a stringent security model for IBSE schemes. We require the usual strong security properties of (for confidentiality) indistinguishability against adaptive chosen-ciphertext attacks, and (for non-repudiation) existential unforgeability against chosen-message insider attacks. In addition, to ensure the strongest possible ciphertext armoring, we also ask (for anonymity) that authorship not be transmitted in the clear, and (for unlinkability) that it remain unverifiable by anyone except (for authentication) the legitimate recipient alone. We then present an efficient IBSE construction, based on bilinear pairings, that satisfies all these security requirements, and yet is as compact as pairing-based IBE and IBS in isolation. Our scheme is secure, compact, fast and practical, offers detachable signatures, and supports multi-recipient encryption with signature sharing for maximum scalability.
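The key-management workflow described above (arbitrary identifiers serving as public keys, with private keys extracted on demand by a trusted private key generator) can be sketched at a toy level. The following Python sketch substitutes an HMAC for the pairing-based extraction of a real IBE scheme such as Boneh-Franklin; all names and parameters are illustrative only:

```python
import hashlib
import hmac

# Toy illustration (not a real pairing-based scheme): the point of
# identity-based cryptography is that any identifier can serve as a
# public key, while the trusted Private Key Generator (PKG) derives
# the matching private key on demand from its master secret.
class PKG:
    def __init__(self, master_secret: bytes):
        self._master = master_secret  # known only to the PKG

    def extract(self, identity: str) -> bytes:
        # Derive the private key for an arbitrary identifier (e.g. an
        # email address). In a real IBE scheme this would instead be a
        # scalar multiple of a hash-to-curve point.
        return hmac.new(self._master, identity.encode(), hashlib.sha256).digest()

pkg = PKG(b"master secret held by the key generator")
k_alice = pkg.extract("alice@example.com")
k_bob = pkg.extract("bob@example.com")
assert k_alice != k_bob                             # distinct identities, distinct keys
assert k_alice == pkg.extract("alice@example.com")  # derivable at any later time
```

Note that no certificate is ever issued: a sender only needs the recipient's identifier, which is the property the abstract's IBSE construction builds on.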

Relevance:

10.00%

Publisher:

Summary:

Nanocomposites have recently emerged as among the most successful materials in biomedical applications. In this work we sought to fabricate fibrous scaffolds that mimic the extracellular matrix of cartilaginous connective tissue not only structurally but also in mechanical and biological terms. Poly(3-hydroxybutyrate) (P3HB) matrices were reinforced with 5, 10 and 15 wt% hydroxyapatite (HA) nanoparticles and electrospun into nanocomposite fibrous scaffolds. The mechanical properties of each case were compared with those of a P3HB scaffold produced under the same processing conditions. Spectroscopic and morphological observations were used to assess the quality of interaction between the constituents. Nanoparticles rested deep within fibers of approximately 1 μm in diameter, with hydrogen bonding linking the constituents across the interface. Maximum elastic modulus and mechanical strength were obtained in the presence of 5 wt% hydroxyapatite nanoparticles. Above 10 wt%, nanoparticles tended to agglomerate, causing the composite to lose mechanical performance; however, viscoelasticity intervened at this concentration and led to delayed failure. In other words, higher elongation at break and a substantially greater work of rupture were observed at 10 wt%.

Relevance:

10.00%

Publisher:

Summary:

Property is an elusive concept. In many respects it has been regarded as a source of authority to use, develop and make decisions about whatever is the subject matter of this right of ownership. This is true whether the holder of the right of ownership is a private entity or a public entity. Increasingly, a right of ownership of this kind has been recognised not only as a source of authority but also as a mechanism for restricting, limiting and perhaps even prohibiting existing or proposed activities that impact upon the environment. It is therefore increasingly an instrument of regulation as much as an instrument of authorisation. The protection and conservation of the environment are ultimately a matter of the public interest. This is not to suggest that individual holders of rights of ownership are not interested in protecting the environment; it is open to them to do so in the exercise of a right of ownership as a source of authorisation. However, a right of ownership – whether private or public – has increasingly become the mechanism through which the environment is protected and conserved, with ownership serving as a means of regulation. This paper addresses these issues from a doctrinal as well as a practical perspective on how the environment is managed.

Relevance:

10.00%

Publisher:

Summary:

Australian airports have emerged as important urban activity centres over the past decade as a result of privatisation. A range of reciprocal airport and regional impacts now pose considerable challenges for both airport operation and the surrounding urban and regional environment. The airport can no longer be managed solely as a specialised transport entity in isolation from the metropolis that it serves. In 2007 a multidisciplinary Australian Research Council Linkage Project (LP 0775225) was funded to investigate the changing role of airports in Australia. This thesis is but one component of this collaborative research effort. Here the issues surrounding the policy and practice of airport and regional land use planning are explored, analysed and detailed. This research, for the first time, assembles a distinct progression of the wider social, economic, technological and environmental roles of the airport within the Australian airport literature from 1914 – 2011. It recognises that while the list of airport and regional impacts has grown through time, treatment within practice and the literature has largely remained highly specialised and contained within disciplinary paradigms. The first publication of the thesis (Chapter 2) acknowledges that the changing role of airports demands the establishment of new models of airport planning and development. It argues that practice and research requires a better understanding of the reciprocal impacts of airports and their urban catchments. The second publication (Chapter 3) highlights that there is ad hoc examination and media attention of high profile airport and regional conflict, but little empirical analysis or understanding of the extent to which all privatised Australian airports are intending to develop. The conceptual and methodological significance of this research is the development of a national land use classification system for on-airport development. 
This paper establishes the extent of on-airport development in Australia, providing insight into the changing land use and economic roles of privatised airports. The third publication (Chapter 4) details new and significant interdependencies for airport and regional development in consideration of the progression of airports as activity centres. Here the model of an ‘airport metropolis’ is offered as an organising device and theoretical contribution for comprehending the complexity and planning of airport and regional development. It delivers a conceptual framework for both research and policy, which acknowledges the reciprocal impacts of economic development, land use, infrastructure and governance ‘interfaces’. In a timely and significant concurrence with this research the Australian Government announced and delivered a National Aviation Policy Review (2008 – 2009). As such the fourth publication (Chapter 5) focuses on the airport and urban planning aspects of the review. This paper also highlights the overall policy intention of facilitating broader airport and regional collaborative processes. This communicative turn in airport policy is significant in light of the communicative theoretical framework of the thesis. The fifth paper of the thesis (Chapter 6) examines three Australian case studies (Brisbane, Adelaide and Canberra) to detail the context of airport and regional land use planning and to apply the airport metropolis model as a framework for research. Through the use of Land Use Forums, over 120 airport and regional stakeholders are brought together to detail their perspectives and interactions with airport and regional land use planning. 
An inductive thematic analysis of the results identifies three significant themes which contribute to the fragmentation of airport and regional land use planning: 1) inadequate coordination and disjointed decision-making; 2) current legislative and policy frameworks; and 3) competing stakeholder priorities and interests. Building on this new knowledge, Chapter 7 details the perceptions of airport and local, state and territory government stakeholders regarding land use relationships, processes and outcomes. A series of semi-structured interviews is undertaken in each of the case studies to inform this research. The potential implications for ongoing communicative practice are discussed in conclusion. The thesis represents an incremental and cumulative research process which delivers new knowledge for the practical understanding and research interpretation of airport and regional land use planning practice and policy. It has developed and applied a robust conceptual framework which delivers significant direction for all stakeholders to better comprehend the relevance of airports in the urban character and design of our cities.

Relevance:

10.00%

Publisher:

Summary:

Creative Statement: “There are those who see Planet Earth as a gigantic living being, one that feeds and nurtures humanity and myriad other species – an entity that must be cared for. Then there are those who see it as a rock full of riches to be pilfered heedlessly in a short-term quest for over-abundance. This ‘cradle to grave’ mentality, it would seem, is taking its toll (unless you’re a virulent disbeliever in climate change). Why not, ask artists Priscilla Bracks and Gavin Sade, take a different approach? To this end they have set out on a near impossible task: to visualise the staggering quantity of carbon produced by Australia every year. Their eerie, glowing plastic cube resembles something straight out of Dr Who or The X Files. And, like the best science fiction, it has technical realities at its heart. Every One, Every Day tangibly illustrates our greenhouse gas output – its 27 m³ volume is approximately the amount of greenhouse gas emitted per capita, daily. Every One, Every Day is lit by an array of LEDs displaying light patterns representing energy use, generated by data from the Australian Energy Market. Every One, Every Day was formed from recycled polyethylene – used milk bottles – ‘lent’ to the artists by a Visy recycling facility. At the end of the Vivid Festival this plastic will be returned to Visy, where it will re-enter the stream of ‘technical nutrients.’ Could we make another world? One that emulates the continuing cycles of nature? One that uses our ‘technical nutrients’ such as plastic and steel in continual cycles, just like a deciduous tree dropping leaves to compost itself and keep its roots warm and moist?” (Ashleigh Crawford. Melbourne – April, 2013) Artistic Research Statement: The research focus of this work is on exploring how to represent complex statistics and data at a human scale, and how to produce a work where a large percentage of the materials could be recycled.
The surface of Every One, Every Day is clad in tiles made from polyethylene, primarily from recycled milk bottles, ‘lent’ to the artists by the Visy recycling facility in Sydney. The tiles will be returned to Visy for recycling. As such the work can be viewed as an intervention in the industrial ecology of polyethylene, and in the process it demonstrates how to sustain cycles of technical materials: by taking the output of a recycling facility back to a manufacturer to produce usable materials. In terms of data visualisation, Every One, Every Day takes the form of a cube with a volume of 27 cubic meters. Annual per capita emissions figures for Australia are cited as ranging between 18 and 25 tons. Assuming the lower figure of 18 tons per capita annually, the 27 cubic meters represents approximately one day of per capita CO2 emissions, where CO2 is a gas at 15 °C and 1 atmosphere of pressure. The work also explores real-time data visualisation by using an array of 600 controllable LEDs inside the cube. Illumination patterns are derived from real-time data from the Australian Energy Market, using the dispatch interval price and demand graph for New South Wales. The two variables of demand and price are mapped to properties of the illumination: hue, brightness, movement, frequency and so on. The research underpinning the project spanned industrial ecology, data visualisation and public art practices. The result is that Every One, Every Day is one of the first public artworks to successfully bring together materials, physical form and real-time data representation in a unified whole.
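The 27 cubic meter figure can be checked with a back-of-the-envelope ideal gas calculation, assuming the lower cited figure of 18 metric tonnes of CO2 per capita per year, at 15 °C and 1 atmosphere:

```python
# Back-of-the-envelope check of the 27 m^3 cube: volume of one day's
# per-capita CO2 emissions as an ideal gas at 15 C and 1 atm,
# assuming 18 metric tonnes per capita per year (the lower figure).
R = 8.314          # J/(mol*K), gas constant
T = 288.15         # K (15 C)
P = 101325.0       # Pa (1 atm)
M_CO2 = 0.04401    # kg/mol, molar mass of CO2

molar_volume = R * T / P                          # m^3 per mol of ideal gas
daily_mass = 18000.0 / 365.0                      # kg of CO2 per capita per day
daily_volume = daily_mass / M_CO2 * molar_volume  # m^3 per capita per day
print(round(daily_volume, 1))                     # → 26.5, i.e. roughly the 27 m^3 cube
```

So the sculpture's volume is consistent, to within rounding, with the stated assumptions.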

Relevance:

10.00%

Publisher:

Summary:

An issue gaining prominence in our urban environments is the notion of lost space: undesirable urban areas that are in need of redesign, commonly caused by a focus on development as individual architectural entities without a greater view of the urban environment as a holistic entity. Within the context of South East Queensland, the suburb of Fortitude Valley has been earmarked for development as an extension of the current CBD. With lost and disused spaces already existing throughout the suburb due to rapid growth and mismatched developments, recent planning regimes have proposed rejuvenation in the form of proposals that echo typologies from other Australian regions, such as the laneway typology from Melbourne. Opportunities exist in these spaces for design approaches that relate specifically to the individual and unique subtropical character of the area. This research explores the relationship between innovative approaches to urban greenery as a means to rejuvenate lost and disused public space, and their suitability within a subtropical climate, focused specifically on the suburb of Fortitude Valley. A trend gaining prominence is the notion of biophilic cities: cities that integrate urban greenery as a means to provide vibrant public spaces and meet the growing aesthetic, social, cultural and economic needs of our cities. Through analysis of case studies showcasing greenery in inventive ways, observations of the public using subtropical public space, and a discussion of the current policy frameworks in place within Fortitude Valley, innovative use of urban greenery is proposed as a viable placemaking technique in subtropical urban environments.

Relevance:

10.00%

Publisher:

Summary:

Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection to messages. This approach is more efficient than the two-step process of providing confidentiality for a message by encrypting it and, in a separate pass, providing integrity protection by generating a Message Authentication Code (MAC). AE using symmetric ciphers can be provided by either stream ciphers with built-in authentication mechanisms or block ciphers using appropriate modes of operation. Stream ciphers, however, have the potential for higher performance and a smaller footprint in hardware and/or software than block ciphers. This property makes stream ciphers suitable for resource-constrained environments, where storage and computational power are limited. There have been several recent stream cipher proposals that claim to provide AE. These ciphers can be analysed using existing techniques that consider confidentiality or integrity separately; however, there is currently no framework for the analysis of AE stream ciphers that considers these two properties simultaneously. This thesis introduces a novel framework for the analysis of AE using stream cipher algorithms, analysing the mechanisms for providing confidentiality and integrity in AE algorithms based on stream ciphers. There is a greater emphasis on the analysis of the integrity mechanisms, as there is little in the public literature on this in the context of authenticated encryption. The thesis has four main contributions, as follows. The first contribution is the design of a framework that can be used to classify AE stream ciphers based on three characteristics. The first classification applies Bellare and Namprempre's work on the order in which the encryption and authentication processes take place.
The second classification is based on the method used for accumulating the input message (either directly or indirectly) into the internal state of the cipher to generate a MAC. The third classification is based on whether the sequence used to provide encryption and authentication is generated using a single key and initial vector, or two keys and two initial vectors. The second contribution is the application of an existing algebraic method to analyse the confidentiality algorithms of two AE stream ciphers, namely SSS and ZUC. The algebraic method is based on considering the nonlinear filter (NLF) of these ciphers as a combiner with memory. This method enables us to construct equations for the NLF that relate the inputs, outputs and memory of the combiner to the output keystream. We show that both of these ciphers are secure from this type of algebraic attack. We conclude that using a key-dependent S-box in the NLF twice, and using two different S-boxes in the NLF of ZUC, prevents this type of algebraic attack. The third contribution is a new general matrix-based model for MAC generation where the input message is injected directly into the internal state. This model describes the accumulation process when the input message is injected directly into the internal state of a nonlinear filter generator. We show that three recently proposed AE stream ciphers can be considered as instances of this model, namely SSS, NLSv2 and SOBER-128. Our model is more general than previous investigations into direct injection. Possible forgery attacks against this model are investigated, and it is shown that using a nonlinear filter in the accumulation process of the input message, when either the input message or the initial state of the register is unknown, prevents forgery attacks based on collisions. The last contribution is a new general matrix-based model for MAC generation where the input message is injected indirectly into the internal state.
This model uses the input message as a controller to accumulate a keystream sequence into an accumulation register. We show that three current AE stream ciphers can be considered as instances of this model; namely ZUC, Grain-128a and Sfinks. We establish the conditions under which the model is susceptible to forgery and side-channel attacks.
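The first classification above follows Bellare and Namprempre's generic compositions. As an illustration of one of those compositions (and not of any cipher analysed in the thesis), the following Python sketch shows Encrypt-then-MAC over a toy keystream; SHAKE-256 output stands in for a real stream cipher, and all key and function names are illustrative:

```python
import hashlib
import hmac

# Toy Encrypt-then-MAC composition: encrypt with a keystream, then
# MAC the (nonce, ciphertext) pair. Separate keys are used for the
# two roles, matching the "two keys" case of the third classification.
def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    # SHAKE-256 as a toy keystream generator; not a real stream cipher
    return hashlib.shake_256(key + nonce).digest(n)

def ae_encrypt(enc_key: bytes, mac_key: bytes, nonce: bytes, msg: bytes):
    ct = bytes(m ^ k for m, k in zip(msg, keystream(enc_key, nonce, len(msg))))
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return ct, tag

def ae_decrypt(enc_key: bytes, mac_key: bytes, nonce: bytes, ct: bytes, tag: bytes):
    expected = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):  # verify before decrypting
        raise ValueError("forged or corrupted ciphertext")
    return bytes(c ^ k for c, k in zip(ct, keystream(enc_key, nonce, len(ct))))

ct, tag = ae_encrypt(b"enc-key", b"mac-key", b"nonce-0", b"attack at dawn")
assert ae_decrypt(b"enc-key", b"mac-key", b"nonce-0", ct, tag) == b"attack at dawn"
```

Verifying the tag before any decryption is the design choice that makes Encrypt-then-MAC the generically preferred ordering in Bellare and Namprempre's analysis.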

Relevance:

10.00%

Publisher:

Summary:

The research reported here addresses the problem of athlete off-field behaviours as they influence sports' sponsors, particularly the achievement of sponsorship objectives. The question arises because of incidents of sponsorship contract cancellation following news-media reporting of athletes' off-field behaviours. Two studies are used to investigate the research question; the first establishes the content of news-media reports, and the second tests the effects of news reports on athlete, team and sponsor evaluations using an experimental design. Key assumptions of the research are that sponsorship objectives are principally consumer-based and mediated. Models of sponsorship argue that sponsors aim to reach and influence consumers through sponsees. Assuming this pathway exists is central to sponsorship activities. A corollary is that other mediators, in this case the news media, may also communicate (uncontrollable) messages, such that a consumer audience may be told of negative news that may then be associated with the sponsor. When sponsors cancel contracts it is assumed that their goal is to control the links between their brand and a negative referent. Balance theory is used to discuss the potential effects of negative off-field behaviours of athletes on sponsors' objectives. Heider's balance theory (1958) explains that individuals prefer to evaluate linked individuals or entities consistently. In the sponsorship context this presents the possibility that a negative evaluation of the athlete's behaviour will contribute to correspondingly negative evaluations of the athlete's team and sponsors. A content analysis (Study 1) was used to survey the types of athlete off-field behaviours commonly reported in a newspaper. In order to provide a local context for the research, articles from the Courier Mail were sampled and teams in the National Rugby League (NRL) competition were the focus of the research.
The study identified nearly 2000 articles referring to the NRL competition; 258 of those refer to off-field incidents involving athletes. The types of behaviours reported include assault, sexual assault allegations, driving under the influence of alcohol, illicit drug use, breaches of club rules, and positive off-field activities (i.e., charitable activities). An experiment (Study 2) tested three news-article stimuli developed from the behaviours identified in Study 1 in a between-subjects design. A measure of Identification with the Team was used as a covariate in the Multivariate Analysis of Covariance. Social identity theory suggests that when an individual identifies with a group, their attitudes and behaviours towards both in- and out-group members are modified. Use of Identification with the Team as a covariate acknowledges that respondents will evaluate behaviours differently according to the attribution of those behaviours to an in- or out-group member. Findings of the research suggest that the news-article stimuli have significant, large effects on evaluations of athlete off-field behaviour and athlete Likability. Consistent with pretest results, charitable fundraising is regarded as extremely positive; the athlete, correspondingly, is likable. Assault is evaluated as extremely negative, and the athlete as unlikable. DUI scores reveal that the athlete's behaviour is seen as very negative; however, the athlete's likability was evaluated as neutral. Treatment group does not produce any significant effects on team or sponsor variables. This research also finds that Identification with the Team has significant, large effects on team variables (Attitude toward the Brand and Corporate Image). Identification also has a significant, large effect on athlete Likability, but not on Attitude toward the Act. Identification with the Team does not produce any significant effects on sponsor variables.
The results of this research suggest that sponsors' consumer-based objectives are not threatened by newspaper reports linking athlete off-field behaviour with their brand. Evaluations of sponsor variables (Attitude toward the Sponsor's Brand and Corporate Image) were consistently positive. Variance in those data, however, cannot be attributed to the experimental stimuli or to Identification with the Team. These results suggest that respondents may regard sponsorships, in principle, as good. Although it is good news for sponsors that negative evaluations of athletes will not produce correspondingly negative evaluations of consumer-based sponsorship objectives, the results indicate problems for sponsorship managers. The failure of Identification with the Team to explain sponsor variable variance indicates that the sponsor has not been evaluated as an entity linked in a relationship with the sporting team and athlete in this research. This result suggests that the sponsee-mediated affective communication path that sponsors aim to use to communicate with desirable publics is not necessarily a path available to them.

Relevance:

10.00%

Publisher:

Summary:

Availability has become a primary goal of information security and is as significant as other goals, in particular confidentiality and integrity. Maintaining availability of essential services on the public Internet is an increasingly difficult task in the presence of sophisticated attackers. Attackers may abuse the limited computational resources of a service provider, and thus managing computational costs is a key strategy for achieving the goal of availability. In this thesis we focus on cryptographic approaches for managing computational costs, in particular computational effort. We focus on two cryptographic techniques: computational puzzles in cryptographic protocols and secure outsourcing of cryptographic computations. This thesis contributes to the area of cryptographic protocols in the following ways. First, we propose the most efficient puzzle scheme based on modular exponentiations which, unlike previous schemes of the same type, involves only a few modular multiplications for solution verification; our scheme is provably secure. We then introduce a new efficient gradual authentication protocol by integrating a puzzle into a specific signature scheme. Our software implementation results for the new authentication protocol show that our approach is more efficient and effective than the traditional RSA signature-based one and improves the DoS resilience of the Secure Sockets Layer (SSL) protocol, the most widely used security protocol on the Internet. Our next contributions relate to capturing a specific property that enables secure outsourcing of cryptographic tasks, in particular partial decryption. We formally define the property of (non-trivial) public verifiability for general encryption schemes, key encapsulation mechanisms (KEMs), and hybrid encryption schemes, encompassing public-key, identity-based, and tag-based encryption flavours.
We show that some generic transformations and concrete constructions enjoy this property, and then present a new public-key encryption (PKE) scheme having this property with a proof of security under standard assumptions. Finally, we combine puzzles with PKE schemes to enable delayed decryption in applications such as e-auctions and e-voting. For this we first introduce the notion of effort-release PKE (ER-PKE), encompassing the well-known timed-release encryption and encapsulated key escrow techniques. We then present a security model for ER-PKE and a generic construction of ER-PKE complying with our security notion.
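The genre of puzzle described above, based on modular exponentiation with cheap verification for the party that set the puzzle, can be illustrated with a classic repeated-squaring construction in the spirit of Rivest, Shamir and Wagner's time-lock puzzles. This is a sketch of the general idea only, not the provably secure scheme proposed in the thesis; the primes and difficulty parameter are toy values:

```python
import math
import random

# Repeated-squaring puzzle sketch: the solver must perform t sequential
# modular squarings, while a verifier who knows the factorisation of n
# reduces the exponent modulo phi(n) and checks with one fast pow().
p, q = 1000003, 1000033              # toy primes; real schemes use large RSA moduli
n, phi = p * q, (p - 1) * (q - 1)
t = 10_000                           # difficulty: number of sequential squarings

x = random.randrange(2, n)
while math.gcd(x, n) != 1:           # ensure x is invertible mod n
    x = random.randrange(2, n)

# Solver: t sequential squarings (no known shortcut without phi).
y = x
for _ in range(t):
    y = y * y % n

# Verifier: exponent 2^t reduced mod phi(n), then a single exponentiation.
assert y == pow(x, pow(2, t, phi), n)
```

The asymmetry (t sequential squarings to solve versus one short exponentiation to verify) is what makes this family of puzzles useful for metering a client's computational effort.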

Relevance:

10.00%

Publisher:

Summary:

Denial-of-service (DoS) attacks are a growing concern for networked services like the Internet. In recent years, major Internet e-commerce and government sites have been disabled due to various DoS attacks. A common form of DoS attack is a resource depletion attack, in which an attacker tries to overload the server's resources, such as memory or computational power, rendering the server unable to service honest clients. A promising way to deal with this problem is for a defending server to identify and segregate malicious traffic as early as possible. Client puzzles, also known as proofs of work, have been shown to be a promising tool to thwart DoS attacks in network protocols, particularly in authentication protocols. In this thesis, we design efficient client puzzles and propose a stronger security model for analysing client puzzles. We revisit a few key establishment protocols to analyse their DoS-resilience properties and strengthen them using existing and novel techniques. Our contributions in the thesis are manifold. We propose an efficient client puzzle that enjoys its security in the standard model under new computational assumptions. Assuming the presence of powerful DoS attackers, we find a weakness in the most recent security model proposed for analysing client puzzles, and this study leads us to introduce a better security model for analysing client puzzles. We demonstrate the utility of our new security definitions by presenting two stronger hash-based client puzzles. We also show that, using stronger client puzzles, any protocol can be converted into a provably secure DoS-resilient key exchange protocol. In other contributions, we analyse the DoS-resilience properties of network protocols such as Just Fast Keying (JFK) and Transport Layer Security (TLS). In the JFK protocol, we identify a new DoS attack by applying Meadows' cost-based framework to analyse DoS-resilience properties. We also prove that the original security claim of JFK does not hold.
We then apply an existing technique to reduce the server cost and prove that the new variant of JFK achieves perfect forward secrecy (a property not achieved by the original JFK protocol) and remains secure under the original security assumptions of JFK. Finally, we introduce a novel cost-shifting technique which reduces the computation cost of the server significantly, and we employ the technique in the most important network protocol, TLS, to analyse the security of the resulting protocol. We also observe that the cost-shifting technique can be incorporated into any Diffie-Hellman based key exchange protocol to reduce the Diffie-Hellman exponential cost of a party by one multiplication and one addition.
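The hash-based client puzzles mentioned above can be illustrated generically. The following sketch is a standard partial-preimage proof of work (the server issues a fresh nonce and difficulty, the client searches for a solution, the server verifies with a single hash); it is an illustration of the tool, not one of the thesis's constructions, and all names are illustrative:

```python
import hashlib
import itertools
import os

# Hash-based client puzzle: find s such that the last d bits of
# SHA-256(nonce || s) are zero. Generation and verification cost the
# server one hash each; solving costs the client ~2^d hashes on average.
def solve(nonce: bytes, d: int) -> int:
    mask = (1 << d) - 1
    for s in itertools.count():
        h = hashlib.sha256(nonce + s.to_bytes(8, "big")).digest()
        if int.from_bytes(h, "big") & mask == 0:
            return s

def verify(nonce: bytes, d: int, s: int) -> bool:
    h = hashlib.sha256(nonce + s.to_bytes(8, "big")).digest()
    return int.from_bytes(h, "big") & ((1 << d) - 1) == 0

nonce = os.urandom(16)   # fresh per connection, so solutions cannot be reused
s = solve(nonce, 12)     # ~4096 hashes of client effort on average
assert verify(nonce, 12, s)
```

Tying each puzzle to a fresh server nonce is what lets a defending server impose per-request effort on clients before committing its own expensive resources.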

Relevance:

10.00%

Publisher:

Summary:

Robust hashing is an emerging field that can be used to hash certain data types in applications unsuitable for traditional cryptographic hashing methods. Traditional hashing functions have been used extensively for data/message integrity, data/message authentication, efficient file identification and password verification. These applications are possible because the hashing process is compressive, allowing for efficient comparisons in the hash domain, and non-invertible, meaning hashes can be used without revealing the original data. These techniques were developed with deterministic (non-changing) inputs such as files and passwords. For such data types a 1-bit or one-character change can be significant, and as a result the hashing process is sensitive to any change in the input. Unfortunately, there are certain applications where input data are not perfectly deterministic and minor changes cannot be avoided. Digital images and biometric features are two types of data where such changes exist but do not alter the meaning or appearance of the input. For such data types cryptographic hash functions cannot be usefully applied. In light of this, robust hashing has been developed as an alternative to cryptographic hashing and is designed to be robust to minor changes in the input. Although similar in name, robust hashing is fundamentally different from cryptographic hashing. Current robust hashing techniques are based not on cryptographic methods but on pattern recognition techniques. Modern robust hashing algorithms consist of feature extraction, followed by a randomization stage that introduces non-invertibility and compression, followed by quantization and binary encoding to produce a binary hash output. In order to preserve the robustness of the extracted features, most randomization methods are linear, and this is detrimental to the security properties required of hash functions.
Furthermore, the quantization and encoding stages used to binarize real-valued features require the learning of appropriate quantization thresholds. How these thresholds are learnt has an important effect on hashing accuracy, and the mere presence of such thresholds is a source of information leakage that can reduce hashing security. This dissertation outlines a systematic investigation of the quantization and encoding stages of robust hash functions. While existing literature has focused on the importance of the quantization scheme, this research is the first to emphasise the importance of quantizer training for both hashing accuracy and hashing security. The quantizer training process is presented in a statistical framework which allows a theoretical analysis of the effects of quantizer training on hashing performance. This is experimentally verified using a number of baseline robust image hashing algorithms over a large database of real-world images. This dissertation also proposes a new randomization method for robust image hashing based on Higher Order Spectra (HOS) and Radon projections. The method is non-linear, which is an essential requirement for non-invertibility, and is designed to produce features better suited to quantization and encoding. The system can operate without the need for quantizer training, is more easily encoded, and displays improved hashing performance when compared to existing robust image hashing algorithms. The dissertation also shows how the HOS method can be adapted to work with biometric features obtained from 2D and 3D face images.
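The quantizer-training step discussed above can be sketched in miniature. This toy Python example (entirely illustrative: per-feature median thresholds learnt from synthetic Gaussian training data) shows how learnt thresholds binarise real-valued features and how the resulting binary hashes are compared by Hamming distance:

```python
import random
from statistics import median

# Toy quantisation-and-encoding stage: learn one threshold per feature
# dimension from training data, binarise features against the learnt
# thresholds, and compare hashes by Hamming distance.
def learn_thresholds(training_features):
    # per-dimension median as the learnt quantization threshold
    return [median(col) for col in zip(*training_features)]

def quantize(features, thresholds):
    return [int(f > t) for f, t in zip(features, thresholds)]

def hamming(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))

random.seed(0)
training = [[random.gauss(0, 1) for _ in range(16)] for _ in range(200)]
thresholds = learn_thresholds(training)

feat = [random.gauss(0, 1) for _ in range(16)]
noisy = [f + random.gauss(0, 0.05) for f in feat]  # a minor perturbation
h1, h2 = quantize(feat, thresholds), quantize(noisy, thresholds)
assert hamming(h1, h1) == 0
assert hamming(h1, h2) <= 5   # robustness: a small input change yields a nearby hash
```

The threshold list itself illustrates the leakage point the abstract raises: anyone holding the thresholds learns where feature values concentrate, which is information about the training data.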

Relevância:

10.00%

Publicador:

Resumo:

Examining the evolution of British and Australian policing, this comparative review of the literature considers the historical underpinnings of policing in these two countries and the impact of community legitimacy derived from the early concepts of policing by consent. Using the August 2011 disorder in Britain as a lens, this paper considers whether, in striving to maintain community confidence, undue emphasis is placed on the police's public image at the expense of community safety. Examining the path of policing reform, the impact of bureaucracy on policing and the evolving debate surrounding police performance, this review suggests that, while British and Australian police forces largely deliver on the ideal of an ethical and strong police force, a preoccupation with self-image may in fact tarnish the very thing they strive to achieve – their standing with the public. This paper advocates for a more realistic goal of gaining public respect rather than affection in order to achieve the difficult balance between maintaining trust and respect as an approachable, ethical entity providing firm, confident policing in this ever-evolving, modern society.

Relevância:

10.00%

Publicador:

Resumo:

As highlighted by previous work in Normal Accident Theory [1] and High Reliability Organisations [2], the ability of a system to be flexible is of critical importance to its capability to prepare for, respond to, and recover from disturbances and disasters. This paper proposes that the research into ‘edge organisations’ [3] and ‘agility’ [4] is a potential means to operationalise components that embed high-reliability traits in the management and oversight of critical infrastructure systems. Much prior work has focused on these concepts in a military frame, whereas the study reported here examines their application to aviation infrastructure, specifically a commercial international airport. As a commercial entity functions in a distinct manner from a military organisation, this study aims to better understand the complementary and contradictory components of applying agility work in a commercial context. Findings highlight the challenges of making commercial operators of infrastructure systems agile, as well as of embedding traits of High Reliability in such complex infrastructure settings.

Relevância:

10.00%

Publicador:

Resumo:

In this paper, we present three counterfeiting attacks on block-wise dependent fragile watermarking schemes. We consider vulnerabilities such as the exploitation of weak correlation among block-wise dependent watermarks to modify valid watermarked images in such a way that they are still verified as authentic, though they are not. Experimental results demonstrate the practicability and consequences of the proposed attacks for several relevant schemes. The proposed attack models can be used as a means to systematically examine the security levels of similar watermarking schemes.

Relevância:

10.00%

Publicador:

Resumo:

CubIT is a multi-user, large-scale presentation and collaboration framework installed at the Queensland University of Technology’s (QUT) Cube facility, an interactive facility made up of 48 multi-touch screens and very large projected display screens. CubIT was built to make the Cube facility accessible to QUT’s academic and student population. The system allows users to upload, interact with, and share media content on the Cube’s very large display surfaces. CubIT implements a unique combination of features, including RFID authentication, content management through multiple interfaces, multi-user shared-workspace support, drag-and-drop upload and sharing, dynamic state control between different parts of the system, and execution and synchronisation of the system across multiple computing nodes.