999 results for Algebraic approaches
Abstract:
Stream ciphers are encryption algorithms used for ensuring the privacy of digital telecommunications. They have been widely used for encrypting military communications, satellite communications and pay TV, and for voice encryption on both fixed-line and wireless networks. The current multi-year European project eSTREAM, which aims to select stream ciphers suitable for widespread adoption, reflects the importance of this area of research. Stream ciphers consist of a keystream generator and an output function. Keystream generators produce a sequence that appears to be random, which is combined with the plaintext message using the output function. Most commonly, the output function is binary addition modulo two. Cryptanalysis of these ciphers focuses largely on analysis of the keystream generators and of relationships between the generator and the keystream it produces. Linear feedback shift registers (LFSRs) are widely used components in building keystream generators, as the sequences they produce are well understood. Many types of attack have been proposed for breaking various LFSR-based stream ciphers. A recent attack type is known as an algebraic attack. Algebraic attacks transform the problem of recovering the key into the problem of solving a multivariate system of equations, the solution of which eventually recovers the internal state bits or the key bits. This type of attack has been shown to be effective on a number of regularly clocked LFSR-based stream ciphers. In this thesis, algebraic attacks are extended to a number of well-known stream ciphers where at least one LFSR in the system is irregularly clocked. Applying algebraic attacks to these ciphers has previously been discussed in the open literature only for LILI-128. In this thesis, algebraic attacks are first applied to keystream generators using stop-and-go clocking.
Four ciphers belonging to this group are investigated: the Beth-Piper stop-and-go generator, the alternating step generator, the Gollmann cascade generator and the eSTREAM candidate the Pomaranch cipher. It is shown that algebraic attacks are very effective on the first three of these ciphers. Although no effective algebraic attack was found for Pomaranch, the algebraic analysis led to some interesting findings, including weaknesses that may be exploited in future attacks. Algebraic attacks are then applied to keystream generators using (p, q) clocking. Two well-known examples of such ciphers, the step1/step2 generator and the self-decimated generator, are investigated. Algebraic attacks are shown to be very powerful in recovering the internal state of these generators. A more complex clocking mechanism than either stop-and-go or (p, q) clocking is known as mutual clock control. In mutual clock control generators, the LFSRs control the clocking of each other. Four well-known stream ciphers belonging to this group are investigated with respect to algebraic attacks: the bilateral stop-and-go generator, the A5/1 stream cipher, the Alpha 1 stream cipher, and the more recent eSTREAM proposal, the MICKEY stream ciphers. Some theoretical results regarding the complexity of algebraic attacks on these ciphers are presented. The algebraic analysis of these ciphers showed that, generally, it is hard to generate the system of equations required for an algebraic attack on them. As the algebraic attack could not be applied directly to these ciphers, a different approach was used, namely guessing some bits of the internal state in order to reduce the degree of the equations. Finally, an algebraic attack on Alpha 1 that requires only 128 bits of keystream to recover the 128 internal state bits is presented. An essential process associated with stream cipher proposals is key initialization.
Many recently proposed stream ciphers use an algorithm to initialize the large internal state with a smaller key and possibly publicly known initialization vectors. The effect of key initialization on the performance of algebraic attacks is also investigated in this thesis. The relationship between the two has not been investigated before in the open literature. The investigation is conducted on Trivium and Grain-128, two eSTREAM ciphers. It is shown that the key initialization process has an effect on the success of algebraic attacks, unlike other conventional attacks. In particular, the key initialization process allows an attacker to first generate a small number of equations of low degree and then perform an algebraic attack using multiple keystreams. The effect of the number of iterations performed during key initialization is investigated. It is shown that both the number of iterations and the maximum number of initialization vectors to be used with one key should be carefully chosen. Some experimental results on Trivium and Grain-128 are then presented. Finally, the security with respect to algebraic attacks of the well-known LILI family of stream ciphers, including the unbroken LILI-II, is investigated. These are irregularly clock-controlled nonlinear filter generators. While the structure is defined for the LILI family, a particular parameter choice defines a specific instance. Two well-known such instances are LILI-128 and LILI-II. The security of these and other instances is investigated to identify which instances are vulnerable to algebraic attacks. The feasibility of recovering the key bits using algebraic attacks is then investigated for both LILI-128 and LILI-II. Algebraic attacks which recover the internal state with less effort than exhaustive key search are possible for LILI-128 but not for LILI-II.
Given the internal state at some point in time, the feasibility of recovering the key bits is also investigated, showing that the parameters used in the key initialization process, if poorly chosen, can lead to a key recovery using algebraic attacks.
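The LFSR-based keystream generation and clock control discussed in this abstract can be illustrated with a minimal sketch. The register lengths, feedback taps and clocking rule below are hypothetical toy choices for illustration, not parameters of any cipher analysed in the thesis.

```python
# Illustrative sketch only: a toy Fibonacci LFSR and a stop-and-go generator.
# Register lengths, taps and the clocking rule are hypothetical choices,
# not parameters of any cipher analysed in the thesis.

def lfsr_step(state, taps):
    """Advance a Fibonacci LFSR one step; return (output bit, new state)."""
    out = state[0]
    fb = 0
    for t in taps:
        fb ^= state[t]
    return out, state[1:] + [fb]

def stop_and_go(ctrl, data, ctrl_taps, data_taps, n):
    """Toy stop-and-go generator: the control LFSR steps every cycle;
    the data LFSR steps only when the control output bit is 1.
    Each keystream bit is the data LFSR's current output bit."""
    ks = []
    for _ in range(n):
        c, ctrl = lfsr_step(ctrl, ctrl_taps)
        if c:
            _, data = lfsr_step(data, data_taps)
        ks.append(data[0])
    return ks

# Under regular clocking, every keystream bit is a linear function of the
# initial state, which is what lets an algebraic attack set up a solvable
# system of equations. Irregular clocking, as in stop_and_go above, makes
# that relation depend on the (unknown) control bits.
ks = stop_and_go([1, 0, 1], [1, 0, 0, 1], ctrl_taps=[0, 1], data_taps=[0, 1], n=16)
print(ks)
```

The key point the sketch makes concrete is why irregular clocking complicates the attack: which state bits contribute to a given keystream bit now depends on hidden control bits, raising the degree of the resulting equations.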
Abstract:
This research has established, through ultrasound, near infrared spectroscopy and biomechanics experiments, parameters and parametric relationships that can form the framework for quantifying the integrity of the articular cartilage-on-bone laminate, and objectively distinguish between normal/healthy and abnormal/degenerated joint tissue, with a focus on articular cartilage. This has been achieved by: 1. using traditional experimental methods to produce new parameters for cartilage assessment; 2. using novel methodologies to develop new parameters; and 3. investigating the interrelationships between mechanical, structural and molecular properties to identify and select those parameters and methodologies that can be used in a future arthroscopic probe based on points 1 and 2. By combining the molecular, micro- and macro-structural characteristics of the tissue with its mechanical properties, we arrive at a set of critical benchmarking parameters for viable and early-stage non-viable cartilage. The interrelationships between these characteristics, examined using a multivariate analysis based on principal components analysis, multiple linear regression and general linear modeling, could then be used to determine those parameters and relationships which have the potential to be developed into a future clinical device. Specifically, this research has found that the ultrasound and near infrared techniques can subsume the mechanical parameters and combine to characterise the tissue at the molecular, structural and mechanical levels over the full depth of the cartilage matrix. It is the opinion in this thesis that by enabling the determination of the precise area of influence of a focal defect or disease in the joint, demarcating the boundaries of articular cartilage with different levels of degeneration around a focal defect, better surgical decisions that will advance the processes of joint management and treatment will be achieved.
Providing the basis for a surgical tool, this research will contribute to the enhancement and quantification of arthroscopic procedures, extending to post-treatment monitoring and, as a research tool, will enable a robust method for evaluating developing (particularly focalised) treatments.
Abstract:
Nitrous oxide (N2O) is primarily produced by the microbially mediated nitrification and denitrification processes in soils. Its production is influenced by a suite of climate (i.e. temperature and rainfall) and soil (physical and chemical) variables, by interacting soil and plant nitrogen (N) transformations (either competing for or supplying substrates), and by land management practices. It is not surprising that N2O emissions are highly variable both spatially and temporally. Computer simulation models, which can integrate all of these variables, are required for the complex task of providing quantitative determinations of N2O emissions. Numerous simulation models have been developed to predict N2O production. Each model has its own philosophy in constructing simulation components, as well as its own performance strengths. The models range from those that attempt to comprehensively simulate all soil processes to more empirical approaches requiring minimal input data. These N2O simulation models can be classified into three categories: laboratory, field and regional/global levels. Process-based field-scale N2O simulation models, which simulate whole agroecosystems and can be used to develop N2O mitigation measures, are the most widely used. The current challenge is how to scale up the relatively more robust field-scale models to catchment, regional and national scales. This paper reviews the development history, main construction components, strengths, limitations and applications of the N2O emission models published in the literature. The three scale levels are considered, and the current knowledge gaps and challenges in modelling N2O emissions from soils are discussed.
Abstract:
Increasingly, software is no longer developed as a single system, but rather as a smart combination of so-called software services. Each of these provides an independent, specific and relatively small piece of functionality, which is typically accessible through the Internet from internal or external service providers. To the best of our knowledge, there are no standards or models that describe the sourcing process for these software-based services (SBS). We identify the sourcing requirements for SBS and associate the key characteristics of SBS with the sourcing requirements introduced. Furthermore, we compare the sourcing of SBS with related work in the fields of classical procurement, business process outsourcing, and information systems sourcing. Based on this analysis, we conclude that the direct adoption of these approaches for SBS is not feasible and that new approaches are required for sourcing SBS.
Abstract:
Increasingly, software is no longer developed as a single system, but rather as a smart combination of so-called software services. Each of these provides an independent, specific and relatively small piece of functionality, which is typically accessible through the Internet from internal or external service providers. There are no standards or models that describe the sourcing process for these software-based services (SBS). The authors identify the sourcing requirements for SBS and associate the key characteristics of SBS with the sourcing requirements introduced. Furthermore, this paper compares the sourcing of SBS with related work in the fields of classical procurement, business process outsourcing, and information systems sourcing. Based on this analysis, the authors conclude that the direct adoption of these approaches for SBS is not feasible and that new approaches are required for sourcing SBS.
Abstract:
The study of organizations goes to the roots of social science. Abundant theory provides the basis for explanations of diverse aspects of organizational structure and process. As a subset of organizations, nonprofit organizations can be studied with many of the same theoretical approaches used for studying other organizations. Still, nonprofit organizations have some special characteristics, such as a multiplicity of stakeholders and the use of volunteers; some theories of organizations can therefore be expected to be especially useful for studying nonprofit organizations, while others are likely to be less useful. In general, our approach is to apply relevant organizational theory to nonprofit organizations. As such, this essay is not a typical review of the literature about nonprofit organizations. Instead, the purpose is to equip the reader with conceptual and theoretical tools for understanding nonprofits as organizations.
Abstract:
Infrastructure organisations such as airports, seaports, rail and road operators work in an increasingly challenging business environment as a result of globalisation, privatisation and deregulation. These organisations must ensure that their main resource, their infrastructure assets, is well managed in order to support their business operations. Brisbane Airport is used as a case study to understand the challenges faced in the management of infrastructure assets as well as the approaches used to overcome them. The findings can be useful in helping asset managers to identify the resources they should seek to manipulate in order to make improvements to their activities and contribute to the overall performance of their organisation.
Abstract:
Enterprise Architectures have emerged as comprehensive corporate artefacts that provide structure to the plethora of conceptual views on an enterprise. The recent popularity of a service-oriented design of organizations has added service and related constructs as a new element that requires consideration within an Enterprise Architecture. This paper analyzes and compares the existing proposals for how to best integrate services into Enterprise Architectures. It uses the popular Zachman Framework as an example and differentiates the existing integration alternatives. This research can be generalized beyond service integration into an investigation into how Enterprise Architectures might be extended with emerging constructs.
Abstract:
The robust economic growth across South East Asia and the significant advances in nano-technologies in the past two decades have resulted in the creation of intelligent urban infrastructures. Cities like Seoul, Tokyo and Hong Kong have been competing against each other to develop the first ‘ubiquitous city’, a strategic global node of science and technology that provides all municipal services for residents and visitors via ubiquitous infrastructures. This chapter scrutinises the development of ubiquitous and smart infrastructure in Korea, Japan and Hong Kong. These cases provide invaluable lessons for policy-makers and urban and infrastructure planners when considering adopting these systems approaches in their cities.
Abstract:
This work examines the algebraic cryptanalysis of small-scale variants of LEX-BES. LEX-BES is a stream cipher based on the Advanced Encryption Standard (AES) block cipher. LEX is a generic method for constructing a stream cipher from a block cipher, initially introduced by Biryukov at eSTREAM, the ECRYPT Stream Cipher project, in 2005. The Big Encryption System (BES) is a block cipher introduced at CRYPTO 2002 which facilitates the algebraic analysis of the AES block cipher. In this article, experiments were conducted to find solutions of equation systems describing small-scale LEX-BES using Gröbner basis computations. This follows a similar approach to the work by Cid, Murphy and Robshaw at FSE 2005 that investigated the algebraic cryptanalysis of small-scale variants of the BES. The difference between LEX-BES and BES is that, due to the way the keystream is extracted, the number of unknowns in the LEX-BES equations is smaller than in the BES equations. As far as the authors know, this is the first attempt at creating solvable equation systems for stream ciphers based on the LEX method using Gröbner basis computations.
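The solving step described above can be illustrated on a deliberately tiny system. The equations below are made up for illustration and are unrelated to the actual LEX-BES equation systems; they only show the general pattern of reducing a multivariate system over GF(2) with a Gröbner basis, here via SymPy.

```python
# Toy illustration of the solving step only: a tiny multivariate system
# over GF(2) reduced with a Groebner basis (SymPy). The equations are
# invented for illustration and are unrelated to the real LEX-BES system.
from sympy import symbols, groebner

x, y = symbols('x y')

# System over GF(2); the field equations x^2 + x and y^2 + y restrict
# each variable to {0, 1}, as is standard in algebraic cryptanalysis.
eqs = [x*y + y, x + y + 1, x**2 + x, y**2 + y]

# Lex-ordered Groebner basis over GF(2): triangularizes the system so
# the solution can be read off variable by variable.
gb = groebner(eqs, x, y, order='lex', modulus=2)
print(gb.exprs)

# Brute-force check of the unique GF(2) solution
sols = [(a, b) for a in (0, 1) for b in (0, 1)
        if all(p.subs({x: a, y: b}) % 2 == 0 for p in eqs)]
print(sols)  # → [(1, 0)]
```

For real cipher equations the same pattern applies, but the number of variables and the degrees are far larger, which is why equation-system size (the point of the LEX-BES vs. BES comparison above) matters so much.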
Abstract:
Accurately and effectively simulating large deformation is one of the major challenges in the numerical modeling of metal forming. In this paper, an adaptive local meshless formulation based on meshless shape functions and the local weak form is developed for large deformation analysis. The Total Lagrangian (TL) and Updated Lagrangian (UL) approaches are used and thoroughly compared with each other in terms of computational efficiency and accuracy. It is found that the developed meshless technique provides superior performance to the conventional FEM in dealing with large deformation problems in metal forming. In addition, the TL approach has better computational efficiency than the UL approach. However, the adaptive analysis is much more efficient using the UL approach than the TL approach.
Abstract:
This paper documents some preliminary findings arising from our Creative Industries Faculty’s invitation to academics to submit suitable proposals for Internationalising the Curriculum, an initiative that aligns with the University’s recognition of the importance of “building international components into their teaching programs”. Our research project involves revisiting the literature on internationalising the curriculum with a view to implementing pedagogic and assessment strategies that respect and encourage intercultural and international understandings and competencies. The paper addresses the problems in designing such a unit, in this case an American Literature unit which will be taught and studied in Australia at QUT in 2011. The challenges inherent in the task of internationalising the curriculum stem from the ‘traditional’ and accepted ways of structuring and delivering such units. While the content may be international, the problem remains as to how to go about teaching and assessing the unit to achieve a global approach. How can it be taught in a way that steps outside the borders of our national teaching practices and understanding of western epistemology and becomes far more inclusive of other modes of knowledge?
Abstract:
Over the past twenty years, the conventional knowledge management approach has evolved into a strategic management approach that has found applications and opportunities outside of business, in society at large, through education, urban development, governance, and healthcare, amongst others. Knowledge-Based Development for Cities and Societies: Integrated Multi-Level Approaches elucidates the concepts and challenges of knowledge management for both urban environments and entire regions, enhancing the expertise and knowledge of scholars, researchers, practitioners, managers and urban developers in the development of successful knowledge-based development policies, the creation of knowledge cities and prosperous knowledge societies. This reference creates a large knowledge base for scholars, managers and urban developers and increases awareness of the role of knowledge cities and knowledge societies in the knowledge era, as well as of the challenges and opportunities for future research.
Abstract:
Designing trajectories for a submerged rigid body motivates this paper. Two approaches are addressed: the time optimal approach and the motion planning approach using concatenation of kinematic motions. We focus on the structure of singular extremals and their relation to the existence of rank-one kinematic reductions, thereby linking the optimization problem to the inherent geometric framework. Using these kinematic reductions, we provide a solution to the motion planning problem in the under-actuated scenario or, equivalently, in the case of actuator failures. We conclude the paper by comparing a time optimal trajectory to one formed by concatenation of pure motions.