931 results for Dual compressible hybrid quantum secret sharing schemes
Abstract:
In the modern built environment, building construction and demolition consume large amounts of energy and emit greenhouse gases, largely due to widely used conventional construction materials such as reinforced and composite concrete. These materials consume large quantities of natural resources and possess high embodied energy, and more energy is required to recycle or reuse them at the cessation of use. It is therefore important to use recyclable or reusable new materials in building construction in order to conserve natural resources and reduce the energy and emissions associated with conventional materials. Advances in materials technology have led to the introduction of new composite and hybrid materials in infrastructure construction as alternatives to conventional materials. This research project developed a lightweight, prefabricatable Hybrid Composite Floor Plate System (HCFPS) as an alternative to conventional floor systems, with desirable properties: it is easy to construct, economical, demountable, recyclable and reusable. The component materials of HCFPS are a central Polyurethane (PU) core, outer layers of Glass-fibre Reinforced Cement (GRC) and steel laminates in the tensile regions. This research explored the structural adequacy and performance characteristics of hybridised GRC, PU and steel laminate for the development of HCFPS. Performance characteristics of HCFPS were investigated using Finite Element (FE) simulations supported by experimental testing. Parametric studies were conducted to develop the HCFPS to satisfy static performance requirements, using sectional configuration, span, loading and material properties as the parameters. The dynamic response of HCFPS floors was investigated through parametric studies using material properties, walking frequency and damping as the parameters. The research findings show that HCFPS can be used in office and residential buildings and provides acceptable static and dynamic performance.
Design guidelines were developed for this new floor system. HCFPS is easy to construct and economical compared with conventional floor systems, as it is a lightweight, prefabricatable floor system. It can also be demounted and reused or recycled at the cessation of use because of its component materials.
Abstract:
Understanding network traffic behaviour is crucial for managing and securing computer networks. One important technique is to mine frequent patterns or association rules from analysed traffic data. On the one hand, association rule mining usually generates a huge number of patterns and rules, many of them meaningless or unwanted by the user; on the other hand, it can miss necessary knowledge if it does not consider the hierarchy relationships in the network traffic data. To address these issues, this paper proposes a hybrid association rule mining method for characterising network traffic behaviour. Rather than frequent patterns, the proposed method generates non-similar closed frequent patterns from network traffic data, which can significantly reduce the number of patterns. The method also derives new attributes from the original data to discover novel knowledge according to the hierarchy relationships in network traffic data and user interests. Experiments performed on real network traffic data show that the proposed method is promising and can be used in real applications. Copyright © 2013 John Wiley & Sons, Ltd.
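The reduction at the heart of this approach, from all frequent patterns to closed ones, can be illustrated with a brute-force sketch. This is Python with invented toy "traffic" transactions and thresholds; a real miner would use an algorithm such as FP-growth rather than enumeration. A pattern is closed if no proper superset has the same support, so keeping only closed patterns loses no support information while shrinking the output:

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Enumerate all itemsets meeting min_support (brute force, toy scale)."""
    items = sorted({i for t in transactions for i in t})
    freq = {}
    for k in range(1, len(items) + 1):
        for cand in combinations(items, k):
            support = sum(1 for t in transactions if set(cand) <= t)
            if support >= min_support:
                freq[frozenset(cand)] = support
    return freq

def closed_itemsets(freq):
    """Keep only closed patterns: no proper superset with identical support."""
    return {p: s for p, s in freq.items()
            if not any(p < q and s == freq[q] for q in freq)}

# Toy stand-ins for network traffic records.
transactions = [{"tcp", "http", "port80"},
                {"tcp", "http", "port80"},
                {"tcp", "dns"},
                {"udp", "dns"}]
freq = frequent_itemsets(transactions, min_support=2)
closed = closed_itemsets(freq)
# The closed patterns form a much smaller, lossless summary of freq.
```

Here `{tcp, http}` is frequent but not closed, because its superset `{tcp, http, port80}` has the same support; only the larger pattern is kept.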
Abstract:
Food is inherently cultural, yet it has traditionally been overlooked in many disciplines as a topic worthy of serious investigation. This thesis investigates how food, as a topic of interest, is thriving in an online environment through recipe sharing on food blogs. It applies an ethnographic approach to online community studies, providing a rich description of the food blogging community and demonstrating how food blogging can be seen as a community. Through a case study focusing on one recipe shared across many blogs, it also examines the community in action. As the community has grown, it has become more complex, structured and diverse. The thesis examines its evolution and the response of food-related media and other industries to food blogging. The nature of the food blogging community reflects the cultural and social nature of food and the ongoing evolution of recipe sharing through food-related media. Food blogs provide an insight into the eating habits of ‘ordinary’ people, in a more broad-based manner than traditional food-related media such as cookbooks. Beyond this, food blogs are part of wider cultural trends towards DIY, and provide a useful example of the ongoing transformation of food-related media, food culture and, indeed, culture more broadly.
Abstract:
This thesis investigates how modern individuals relate to themselves and others in the service of shaping their ethical conduct and governing themselves. It considers the use of online social networking sites (SNSs) as one particular practice through which people manage their day-to-day conduct and understandings of self. Current research on the use of SNSs has conceptualised them as tools for communication, information-sharing and self-presentation. This thesis suggests a different way of thinking about these sites: as tools for self-formation. A Foucauldian genealogical, historical and problematising approach is applied in order to explore the processes of subjectivation and the historical backgrounds involved in the use of SNSs. This is complemented with an ANT-based understanding of the role that technologies play in shaping human action. Drawing new connections between three factors shows how they contribute to the ways in which people become selves today. These factors are: first, the psychologisation and rationalisation of modern life, which lead people to confess and talk about themselves in order to improve and perfect themselves; second, the transparency or publicness of modern life, which incites people to reveal themselves constantly to a public audience; and third, the techno-social hybrid character of Western societies. This thesis shows how some older practices of self-formation have been translated into the context of modern technologised societies, and how the care of self has been reinvigorated and combined with the notion of baring the self in public. The thesis contributes a different way of thinking about self and the internet that does not seek to define what the modern self is and how it is staged online, but rather accounts for the multiple, contingent and historically conditioned processes of subjectivation through which individuals relate to themselves and others in the service of governing their daily conduct.
Abstract:
Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection to messages. This approach is more efficient than the two-step process of providing confidentiality for a message by encrypting it and, in a separate pass, providing integrity protection by generating a Message Authentication Code (MAC). AE using symmetric ciphers can be provided either by stream ciphers with built-in authentication mechanisms or by block ciphers using appropriate modes of operation. However, stream ciphers have the potential for higher performance and a smaller footprint in hardware and/or software than block ciphers. This property makes stream ciphers suitable for resource-constrained environments, where storage and computational power are limited. There have been several recent stream cipher proposals that claim to provide AE. These ciphers can be analysed using existing techniques that consider confidentiality or integrity separately; however, there is currently no framework for the analysis of AE stream ciphers that considers these two properties simultaneously. This thesis introduces a novel framework for the analysis of AE using stream cipher algorithms, and analyses the mechanisms for providing confidentiality and for providing integrity in AE algorithms using stream ciphers. There is a greater emphasis on the analysis of the integrity mechanisms, as there is little in the public literature on this in the context of authenticated encryption. The thesis has four main contributions, as follows. The first contribution is the design of a framework that can be used to classify AE stream ciphers based on three characteristics. The first classification applies Bellare and Namprempre's work on the order in which the encryption and authentication processes take place.
The second classification is based on the method used for accumulating the input message (either directly or indirectly) into the internal state of the cipher to generate a MAC. The third classification is based on whether the sequence used to provide encryption and authentication is generated using a single key and initial vector, or two keys and two initial vectors. The second contribution is the application of an existing algebraic method to analyse the confidentiality algorithms of two AE stream ciphers, namely SSS and ZUC. The algebraic method is based on considering the nonlinear filter (NLF) of these ciphers as a combiner with memory. This method enables us to construct equations for the NLF that relate the inputs, outputs and memory of the combiner to the output keystream. We show that both of these ciphers are secure against this type of algebraic attack. We conclude that using a key-dependent S-box in the NLF twice, and using two different S-boxes in the NLF of ZUC, prevents this type of algebraic attack. The third contribution is a new general matrix-based model for MAC generation in which the input message is injected directly into the internal state. This model describes the accumulation process when the input message is injected directly into the internal state of a nonlinear filter generator. We show that three recently proposed AE stream ciphers can be considered as instances of this model, namely SSS, NLSv2 and SOBER-128. Our model is more general than previous investigations into direct injection. Possible forgery attacks against this model are investigated. It is shown that using a nonlinear filter in the accumulation process, when either the input message or the initial state of the register is unknown, prevents forgery attacks based on collisions. The last contribution is a new general matrix-based model for MAC generation in which the input message is injected indirectly into the internal state.
This model uses the input message as a controller to accumulate a keystream sequence into an accumulation register. We show that three current AE stream ciphers can be considered as instances of this model; namely ZUC, Grain-128a and Sfinks. We establish the conditions under which the model is susceptible to forgery and side-channel attacks.
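The direct-injection accumulation described above can be caricatured as a small GF(2) matrix model. This is an illustrative Python sketch: the matrix, injection vector and state size are invented stand-ins, not the actual SSS, NLSv2 or SOBER-128 parameters. The final assertion shows why a purely linear accumulator is forgeable: from a known initial state, MACs XOR like the messages themselves, which is exactly the collision structure a nonlinear filter in the accumulation path destroys.

```python
# Toy GF(2) matrix model of direct message injection into a cipher state.
# A (state-update matrix), b (injection vector) and the state size are
# illustrative stand-ins, not real cipher parameters.

N = 8  # toy state size in bits

def mat_vec(A, v):
    """Multiply an N x N bit matrix by a bit vector over GF(2)."""
    return [sum(A[i][j] & v[j] for j in range(N)) % 2 for i in range(N)]

def accumulate(message_bits, A, b, state):
    """Direct injection: state <- A*state XOR m_t*b for each message bit m_t."""
    for m in message_bits:
        state = mat_vec(A, state)
        state = [s ^ (m & bi) for s, bi in zip(state, b)]
    return state  # the MAC would be read from this accumulated state

# A shift register with feedback taps, written as a matrix, plus an
# injection vector targeting the first state bit.
A = [[1 if j == i + 1 else 0 for j in range(N)] for i in range(N)]
A[N - 1][0] = A[N - 1][3] = 1  # feedback taps
b = [1] + [0] * (N - 1)

mac1 = accumulate([1, 0, 1, 1], A, b, state=[0] * N)
mac2 = accumulate([1, 1, 0, 1], A, b, state=[0] * N)
# Linearity makes collisions easy to engineer: from the zero state, the XOR
# of two MACs equals the MAC of the XOR of the two messages.
diff = accumulate([0, 1, 1, 0], A, b, state=[0] * N)
assert [m1 ^ m2 for m1, m2 in zip(mac1, mac2)] == diff
```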
Abstract:
A dual-scale model of the torrefaction of wood was developed and used to study industrial configurations. At the local scale, the computational code solves the coupled heat and mass transfer and the thermal degradation mechanisms of the wood components. At the global scale, the two-way coupling between the boards and the stack channels is treated as an integral component of the process. This model is used to investigate the effect of the stack configuration on the heat treatment of the boards. The simulations highlight that the heat released by the exothermic reactions occurring in each single board can accumulate along the stack. This phenomenon may result in a dramatic heterogeneity of the process and poses a serious risk of thermal runaway, which is often observed in industrial plants. The model is used to explain how the risk of thermal runaway can be reduced by increasing the airflow velocity or the sticker thickness, or by gas flow reversal.
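The accumulation mechanism can be caricatured with a one-line energy balance. This is a toy steady-state Python sketch, not the dual-scale model itself, and all parameter values (inlet temperature, per-board heat release, mass flow rate) are invented for the example:

```python
# Toy steady-state energy balance along one stack channel: the heat released
# by exothermic reactions in each board is added to the air flowing past it.
# All numbers are illustrative.

def channel_temperatures(n_boards, inlet_T, q_exo, m_dot, cp=1005.0):
    """Air temperature after each board: T_{k+1} = T_k + q_exo / (m_dot * cp).

    q_exo: exothermic heat per board [W], m_dot: air mass flow [kg/s],
    cp: specific heat of air [J/(kg K)].
    """
    T = [inlet_T]
    for _ in range(n_boards):
        T.append(T[-1] + q_exo / (m_dot * cp))
    return T

# Exothermic heat accumulates board after board along the channel...
slow = channel_temperatures(n_boards=20, inlet_T=250.0, q_exo=150.0, m_dot=0.05)
# ...and doubling the airflow halves the cumulative temperature rise,
# mirroring the mitigation by increased airflow velocity noted above.
fast = channel_temperatures(n_boards=20, inlet_T=250.0, q_exo=150.0, m_dot=0.10)
```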
Abstract:
Availability has become a primary goal of information security and is as significant as other goals, in particular confidentiality and integrity. Maintaining the availability of essential services on the public Internet is an increasingly difficult task in the presence of sophisticated attackers. Attackers may abuse the limited computational resources of a service provider, and thus managing computational costs is a key strategy for achieving the goal of availability. In this thesis we focus on cryptographic approaches for managing computational costs, in particular computational effort. We focus on two cryptographic techniques: computational puzzles in cryptographic protocols and secure outsourcing of cryptographic computations. This thesis contributes to the area of cryptographic protocols in the following ways. First, we propose the most efficient puzzle scheme based on modular exponentiations which, unlike previous schemes of the same type, involves only a few modular multiplications for solution verification; our scheme is provably secure. We then introduce a new efficient gradual authentication protocol by integrating a puzzle into a specific signature scheme. Our software implementation results for the new authentication protocol show that our approach is more efficient and effective than the traditional RSA signature-based one and improves the DoS resilience of the Secure Socket Layer (SSL) protocol, the most widely used security protocol on the Internet. Our next contributions relate to capturing a specific property that enables secure outsourcing of cryptographic tasks, in particular partial decryption. We formally define the property of (non-trivial) public verifiability for general encryption schemes, key encapsulation mechanisms (KEMs), and hybrid encryption schemes, encompassing the public-key, identity-based, and tag-based encryption flavors.
We show that some generic transformations and concrete constructions enjoy this property, and then present a new public-key encryption (PKE) scheme that has this property, with a proof of security under standard assumptions. Finally, we combine puzzles with PKE schemes to enable delayed decryption in applications such as e-auctions and e-voting. For this, we first introduce the notion of effort-release PKE (ER-PKE), encompassing the well-known timed-release encryption and encapsulated key escrow techniques. We then present a security model for ER-PKE and a generic construction of ER-PKE complying with our security notion.
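The flavour of a modular-exponentiation puzzle with cheap verification can be sketched as follows. This is a generic time-lock-style construction in Python, not necessarily the thesis's actual scheme, and the parameters are toy-sized: the solver must perform many sequential modular squarings, while the issuer, knowing the factorisation of the modulus, verifies with a single fast exponentiation.

```python
# Toy modular-exponentiation puzzle: the solver performs t sequential
# squarings mod n; the issuer, knowing the factorisation of n, verifies
# cheaply by reducing the exponent modulo phi(n). Parameters are far too
# small for real security.

p, q = 10007, 10009                  # issuer's secret primes (toy size)
n, phi = p * q, (p - 1) * (q - 1)

def solve(x, t, n):
    """Solver's work: t sequential modular squarings, x -> x^(2^t) mod n."""
    y = x % n
    for _ in range(t):
        y = y * y % n
    return y

def verify(x, t, y, n, phi):
    """Issuer's check: one fast exponentiation using the trapdoor phi(n)."""
    return pow(x, pow(2, t, phi), n) == y

x, t = 12345, 10000
y = solve(x, t, n)                   # ~t multiplications for the solver
ok = verify(x, t, y, n, phi)         # a few multiplications for the issuer
```

The squarings are inherently sequential, so `t` tunes the solver's effort, while verification cost is essentially independent of `t`.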
Abstract:
The success or effectiveness of any aircraft design is a function of many trade-offs. Over the last 100 years of aircraft design these trade-offs have been optimized and dominant aircraft design philosophies have emerged. Pilotless aircraft (or uninhabited airborne systems, UAS) present new challenges in the optimization of their configuration. Recent developments in battery and motor technology have seen an upsurge in the utility and performance of electric-powered aircraft. The opportunity to explore hybrid-electric aircraft powerplant configurations is therefore compelling. This thesis considers the design of such a configuration from an overall propulsive and energy efficiency perspective. A prototype system was constructed using a representative small-UAS internal combustion engine (a 10 cc methanol two-stroke) and a 600 W brushless direct-current (BLDC) motor; these components were chosen to be representative of those found on typical small UAS. The system was tested on a dynamometer in a wind tunnel, and the results show an improvement in overall propulsive efficiency of 17% compared with a non-hybrid powerplant. In this case, the improvement results from the larger propeller that the hybrid solution allows, which shows that general efficiency improvements are possible using hybrid configurations for aircraft propulsion. Additionally, this approach provides new improvements in operational and mission flexibility (such as the provision of self-starting), which are outlined in the thesis. Specifically, the opportunity to use the windmilling propeller for energy regeneration was explored. It was found (in the prototype configuration) that significant power (60 W) is recoverable in a steep dive; although the efficiency of regeneration is low, the capability allows several options for improved mission viability.
The thesis concludes with the general statement that a hybrid powerplant improves the overall mission effectiveness and propulsive efficiency of small UAS.
Abstract:
The communal nature of knowledge production predicts the importance of creating learning organisations where knowledge arises out of processes that are personal, social, situated and active. It follows that workplaces must provide both formal and informal learning opportunities for interaction with ideas and among individuals. This grounded theory for developing contemporary learning organisations harvests insights from the knowledge management, systems sciences, and educational learning literatures. The resultant hybrid theoretical framework informs practical application, as reported in a case study that harnesses the accelerated information exchange possibilities enabled through Web 2.0 social networking and peer production technologies. Through complementary organisational processes, 'meaning making' is negotiated in formal face-to-face meetings supplemented by informal 'boundary spanning' dialogue. The organisational capacity-building potential of this participatory and inclusive approach is illustrated through the example of the Dr. Martin Luther King, Jr. Library in San Jose, California, USA. As an outcome of the strategic planning process at this joint city-university library, communication, decision-making, and planning structures, processes, and systems were re-invented. An enterprise-level redesign is presented, which fosters contextualising information interactions for knowledge sharing and community building. Knowledge management within this context envisions organisations as communities where knowledge, identity, and learning are situated. This framework acknowledges the social context of learning, i.e., that knowledge is acquired and understood through action, interaction, and sharing with others. It follows that social networks provide peer-to-peer enculturation through intentional exchange of tacit information made explicit.
This, in turn, enables a dynamic process experienced as a continuous spiral that perpetually elevates collective understanding and enables knowledge creation.
Abstract:
We introduce a broad lattice manipulation technique for expressive cryptography, and use it to realize functional encryption for access structures from post-quantum hardness assumptions. Specifically, we build an efficient key-policy attribute-based encryption scheme, and prove its security in the selective sense from learning-with-errors intractability in the standard model.
Abstract:
While dual degree programs (DDPs) between Australian and Indonesian universities are expected to facilitate knowledge transfer (KT) between the partnering universities, little is known about how, and what, KT processes take place within DDP partnerships. Using an inter-organisational KT framework, this study investigated Indonesian universities' rationales for and outcomes of establishing DDPs, and the mechanisms facilitating knowledge transfer between Australian and Indonesian universities. Two Indonesian universities, along with their common Australian partner university, participated in this case study. Semi-structured interviews with 27 key university officers, together with pertinent university documents, provided the main data. Both data sources were thematically analysed to identify emerging patterns. The findings suggest that the Indonesian universities prioritised developing capacity to improve their international recognition more than the Australian partner did. Consequently, the DDPs benefited the Indonesian universities through capacity development made possible by KT from the Australian DDP partner. KT processes occurred in the DDP partnerships, particularly through curriculum collaboration, but were more limited in the managerial area. Factors enabling KT included both technology-aided and face-to-face communication, the intention to acquire knowledge from the partner, capitalising on the unequal power relations to advance KT opportunities, and knowledge management systems. The findings suggest the importance of prioritising capacity development in DDP partnerships to enable KT, executing the KT stages to ensure institutionalisation of acquired knowledge into the university's systems and policies, and maintaining the financial sustainability of the DDPs to reach mutually beneficial outcomes for Australian and Indonesian universities.
Abstract:
In this paper, we present three counterfeiting attacks on block-wise dependent fragile watermarking schemes. We consider vulnerabilities such as the exploitation of a weak correlation among block-wise dependent watermarks to modify valid watermarked (medical or other digital) images that can still be verified as authentic, even though they are not. Experimental results demonstrate the practicability and consequences of the proposed attacks for some relevant schemes. The proposed attack models can be used as a means to systematically examine the security levels of similar watermarking schemes.
Abstract:
In the real world there are many problems in networks of networks (NoNs) that can be abstracted to the so-called minimum interconnection cut problem, which is fundamentally different from the classical minimum cut problems in graph theory. It is therefore desirable to develop an efficient and effective algorithm for the minimum interconnection cut problem. In this paper we formulate the problem in graph-theoretic terms, transform it into a multi-objective and multi-constraint combinatorial optimization problem, and propose a hybrid genetic algorithm (HGA) for it. The HGA is a penalty-based genetic algorithm (GA) that incorporates an effective heuristic procedure to locally optimize the individuals in the GA population. The HGA has been implemented and evaluated experimentally, and the results show that it is effective and efficient.
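The penalty-plus-local-heuristic idea can be illustrated with a small sketch. This is not the paper's actual HGA: the Python below runs a penalty-based GA, with a greedy local-improvement step, on a toy problem of cutting edges to disconnect a source `s` from a sink `t` at minimum cost; the graph, penalty weight and GA settings are invented for the example.

```python
import random

random.seed(1)

# Toy graph: genome bit i = 1 means edge i is cut.
edges = [("s", "a", 5), ("s", "b", 4), ("a", "t", 2), ("b", "t", 3), ("a", "b", 1)]
PENALTY = 100  # added to the fitness of any genome that fails to disconnect s, t

def connected(bits):
    """Union-find check: does any uncut path remain between 's' and 't'?"""
    parent = {}
    def find(u):
        parent.setdefault(u, u)
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    for b, (u, v, _) in zip(bits, edges):
        if not b:  # edge not cut, so it still joins u and v
            parent[find(u)] = find(v)
    return find("s") == find("t")

def fitness(bits):
    """Total cut cost, plus a penalty if the constraint is violated."""
    cost = sum(w for b, (_, _, w) in zip(bits, edges) if b)
    return cost + (PENALTY if connected(bits) else 0)

def local_improve(bits):
    """Heuristic local step: un-cut any edge whose removal keeps s, t apart."""
    for i in range(len(bits)):
        if bits[i]:
            bits[i] = 0
            if connected(bits):
                bits[i] = 1
    return bits

def evolve(pop_size=20, gens=50):
    pop = [[random.randint(0, 1) for _ in edges] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            point = random.randrange(1, len(edges))
            child = p1[:point] + p2[point:]        # one-point crossover
            if random.random() < 0.3:              # bit-flip mutation
                child[random.randrange(len(edges))] ^= 1
            children.append(local_improve(child))  # heuristic local step
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
```

On this toy instance the optimum is to cut the two sink-side edges, for a total cost of 5; the penalty term steers the GA away from cheap but disconnection-violating genomes without excluding them from the search.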
Abstract:
A one-time program is a hypothetical device by which a user may evaluate a circuit on exactly one input of his choice, before the device self-destructs. One-time programs cannot be achieved by software alone, as any software can be copied and re-run. However, it is known that every circuit can be compiled into a one-time program using a very basic hypothetical hardware device called a one-time memory. At first glance it may seem that quantum information, which cannot be copied, might also allow for one-time programs. But it is not hard to see that this intuition is false: one-time programs for classical or quantum circuits based solely on quantum information do not exist, even with computational assumptions. This observation raises the question, "what assumptions are required to achieve one-time programs for quantum circuits?" Our main result is that any quantum circuit can be compiled into a one-time program assuming only the same basic one-time memory devices used for classical circuits. Moreover, these quantum one-time programs achieve statistical universal composability (UC-security) against any malicious user. Our construction employs methods for computation on authenticated quantum data, and we present a new quantum authentication scheme called the trap scheme for this purpose. As a corollary, we establish UC-security of a recent protocol for delegated quantum computation.