64 results for initialisation flaws


Relevance: 10.00%

Abstract:

A5/1 is a shift-register-based stream cipher which provides privacy for the GSM system. In this paper, we analyse the loading of the secret key and IV during the initialisation process of A5/1. We demonstrate the existence of weak key-IV pairs in the A5/1 cipher due to this loading process; these weak key-IV pairs may leave one, two or three registers containing all-zero values, which may in turn lead to weak keystream sequences. For the case where two or three registers contain only zeros, we describe a distinguisher which leads to complete decryption of the affected messages.
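The weakness described hinges on a basic LFSR property: a linear feedback shift register whose state is all zeros can never leave that state, so its contribution to the keystream is constant. A minimal sketch of this (the register length and tap positions below are illustrative, not A5/1's actual parameters):

```python
def lfsr_step(state, taps):
    """Advance a Fibonacci LFSR one step; state is a list of bits."""
    feedback = 0
    for t in taps:
        feedback ^= state[t]
    out = state[-1]
    return [feedback] + state[:-1], out

# An all-zero register is a fixed point: the feedback is always 0,
# so the register stays at zero and emits zeros forever.
state = [0] * 19          # register length is illustrative
taps = [0, 1, 2, 5]       # illustrative tap positions
outputs = []
for _ in range(10):
    state, bit = lfsr_step(state, taps)
    outputs.append(bit)
assert state == [0] * 19 and outputs == [0] * 10
```

In A5/1 the keystream combines the outputs of three such registers, which is why one, two or three all-zero registers progressively weaken, and eventually trivialise, the keystream.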

Relevance: 10.00%

Abstract:

Actin, the most abundant protein in living cells, plays critical roles in force generation and transmission within the cell. The fracture mechanism of microfilament networks, whose principal component is actin, provides insight into the self-protective character of the cytoskeleton. In this study, molecular simulations are conducted to investigate the molecular mechanisms by which microfilament networks are disrupted, from a biophysical viewpoint. Employing a coarse-grained (CG) model of actin filament networks, we focus on the ultimate strength and crack growth mode of microfilament networks and their dependence on crack length. We find that the fracture mechanism depends on the structural properties of the network: structural flaws only marginally change the strength of microfilament networks, which would explain the self-protective character of the cytoskeleton.

Relevance: 10.00%

Abstract:

The perceived desirability of water views continues to lead increasing numbers of people to relocate to coastal regions. Proximity to coastal water brings unique risks from rising sea levels; however, water can present a risk in any area, whether or not you have water views. Recent Australian and international disasters show that even inland populations outside traditional flood areas are not immune from water risks. The author examines the nature of these risks and shows how the internet can be used as a tool for identifying risk areas. The author also highlights the need to ensure the accuracy of the data used for valuation and planning purposes, and identifies flaws in current data provision.

Relevance: 10.00%

Abstract:

Scientific visualisations such as computer-based animations and simulations are increasingly a feature of high school science instruction. Visualisations are adopted enthusiastically by teachers and embraced by students, and there is good evidence that they are popular and well received. There is limited evidence, however, of how effective they are in enabling students to learn key scientific concepts. This paper reports the results of a quantitative study conducted in Australian chemistry classrooms. The visualisations, chosen from free online sources to model the ways in which classroom teachers use visualisations, were found to have serious flaws for conceptual learning. There were also challenges in the degree of interactivity available to students using the visualisations. Within these limitations, no significant difference was found between teaching with and without these visualisations. Further study, using better-designed visualisations and with explicit attention to the pedagogy surrounding them, will be required to gather high-quality evidence of the effectiveness of visualisations for conceptual development.

Relevance: 10.00%

Abstract:

Rakaposhi is a synchronous stream cipher built from three main components: a non-linear feedback shift register (NLFSR), a dynamic linear feedback shift register (DLFSR) and a non-linear filtering function (NLF). The NLFSR consists of 128 bits and is initialised by the secret key K; the DLFSR holds 192 bits and is initialised by an initial vector (IV); the NLF takes 8-bit inputs and returns a single output bit. This work identifies weaknesses and properties of the cipher. The main observation is that the initialisation procedure has the so-called sliding property, which can be used to launch distinguishing and key-recovery attacks. The distinguisher needs four observations of related (K, IV) pairs, and the key-recovery algorithm discovers the secret key K after observing 2^9 pairs of (K, IV). Based on the proposed related-key attack, the number of related (K, IV) pairs is 2^(128+192)/4. The cipher is further studied when the registers enter short cycles. When the NLFSR is set to all ones, the cipher degenerates to a linear feedback shift register with a non-linear filter; consequently, the initial state (and hence the secret key and IV) can be recovered with complexity 2^63.87. If the DLFSR is set to all zeros, the NLF reduces to a filter function of low non-linearity; as a result, the cipher is insecure, allowing an adversary to distinguish it from a random cipher after 2^17 observations of keystream bits, and a key-recovery algorithm finds the secret key with complexity 2^54.

Relevance: 10.00%

Abstract:

Formal representations of business processes are used to analyse process behaviour. Workflow nets are a widely used formalism for describing the behaviour of business processes. The structure theory of processes investigates the relation between the structure of a model and its behaviour. In this paper, we propose to employ the connectivity property of workflow nets as an angle for their structural analysis. In particular, we show how soundness verification can be organised using the biconnected components of a workflow net. This allows for efficient identification and localisation of flaws in the behaviour of workflow nets, and for supporting process analysts with diagnostic information.
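The decomposition idea can be sketched with a standard articulation-point search: biconnected components of the (undirected) net graph meet exactly at articulation points, so a behavioural flaw can be localised to one component rather than to the whole net. A toy sketch (the node names and net below are hypothetical; the paper's analysis operates on the places and transitions of a Petri net):

```python
def articulation_points(graph):
    """Tarjan-style DFS: return the cut vertices of an undirected graph
    given as {node: set_of_neighbours}."""
    disc, low, cuts = {}, {}, set()
    counter = [0]

    def dfs(u, parent):
        disc[u] = low[u] = counter[0]; counter[0] += 1
        children = 0
        for v in graph[u]:
            if v not in disc:
                children += 1
                dfs(v, u)
                low[u] = min(low[u], low[v])
                # u separates v's subtree from the rest of the graph
                if parent is not None and low[v] >= disc[u]:
                    cuts.add(u)
            elif v != parent:
                low[u] = min(low[u], disc[v])
        if parent is None and children > 1:
            cuts.add(u)

    for node in graph:
        if node not in disc:
            dfs(node, None)
    return cuts

# Toy net: source -> a -> b -> sink, plus a parallel branch a -> c -> b.
# Removing 'a' or 'b' disconnects the graph, so they are the cut vertices
# separating the net's biconnected components.
g = {
    "source": {"a"},
    "a": {"source", "b", "c"},
    "b": {"a", "c", "sink"},
    "c": {"a", "b"},
    "sink": {"b"},
}
assert articulation_points(g) == {"a", "b"}
```

A soundness checker can then verify each biconnected component separately and report the component (and its cut vertices) when a flaw is found, which is the diagnostic benefit the abstract describes.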

Relevance: 10.00%

Abstract:

Objectives: Concentrations of troponin measured with high-sensitivity troponin assays are raised in a number of emergency department (ED) patients; however, many are not diagnosed with acute myocardial infarction (AMI). Clinical comparisons between the early use (2 h after presentation) of high-sensitivity cardiac troponin T (hs-cTnT) and I (hs-cTnI) assays for the diagnosis of AMI have not been reported. Design and methods: Early (0 h and 2 h) hs-cTnT and hs-cTnI assay results in 1571 ED patients with potential acute coronary syndrome (ACS) without ST elevation on the electrocardiogram (ECG) were evaluated. The primary outcome was the diagnosis of index MI adjudicated by cardiologists using the local cTnI assay results taken ≥6 h after presentation, ECGs and clinical information. Stored samples were later analysed with hs-cTnT and hs-cTnI assays. Results: The area under the ROC curve for AMI (204 patients; 13.0%) after 2 h was 0.95 (95% CI: 0.94–0.97) for hs-cTnT and 0.98 (95% CI: 0.97–0.99) for hs-cTnI. For hs-cTnT and hs-cTnI respectively, sensitivity after 2 h was 94.1% (95% CI: 90.0–96.6) and 95.6% (95% CI: 91.8–97.7); specificity was 79.0% (95% CI: 76.8–81.1) and 92.5% (95% CI: 90.9–93.7); PLR was 4.48 (95% CI: 4.02–5.00) and 12.86 (95% CI: 10.51–15.31); and NLR was 0.07 (95% CI: 0.04–0.13) and 0.05 (95% CI: 0.03–0.09). Conclusions: Exclusion of AMI 2 h after presentation in emergency patients with possible ACS can be achieved using either hs-cTnT or hs-cTnI assays. The significant difference in specificity between the assays is clinically relevant: if the hs-cTnT assay is used, further clinical assessment would be required in a larger proportion of patients.
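The reported likelihood ratios follow directly from sensitivity and specificity: PLR = sensitivity / (1 − specificity) and NLR = (1 − sensitivity) / specificity. A quick check against the hs-cTnT figures in the abstract:

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios from test accuracy."""
    plr = sensitivity / (1.0 - specificity)
    nlr = (1.0 - sensitivity) / specificity
    return plr, nlr

# hs-cTnT at 2 h: sensitivity 94.1%, specificity 79.0% (from the abstract)
plr, nlr = likelihood_ratios(0.941, 0.790)
assert round(plr, 2) == 4.48   # matches the reported PLR
assert round(nlr, 2) == 0.07   # matches the reported NLR
```

The same formulas applied to the hs-cTnI figures reproduce its much higher PLR, driven by its 92.5% specificity.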

Relevance: 10.00%

Abstract:

"This work forms part of a much larger collaborative album project in progress between Tim Bruniges, Julian Knowles and David Trumpmanis which explores the intersections between traditional rock instrumentation and analogue and digital media. All of the creative team are performers, composers and producers. The material for the album was thus generated by a series of in-studio improvisations and performances, with each collaborator assuming a range of different and alternating roles – guitars, electronics, drums, percussion, bass, keyboards and production. Thematically the work explores the intersection of instrumental (post) rock, ambient music, and historical electro-acoustic tape composition traditions. Over the past 10 years, musical practice has become increasingly hybrid, with the traditional boundaries between genres becoming progressively eroded. At the same time, digital tools have replaced many of the major analogue technologies that dominated music production and performance in the 20th century. The disappearance of analogue media in mainstream musical practice has had a profound effect on the sonic characteristics of contemporary music and the gestural basis for its production. Despite the increasing power of digital technologies, a small but dedicated group of practitioners has continued to prize and use analogue technology for its unique sounds and the non-linearity of the media, aestheticising its inherent limitations and flaws. At the most radical end of this spectrum lie glitch and lo-fi musical forms, seen in part as reactions to the clinical nature of digital media and the perceived lack of character associated with its transparency. Such developments have also problematised the traditional relationships between media and genre, where specific techniques and their associated sounds have become genre markers.
Tristate is an investigation into this emerging set of dialogues between analogue and digital media across composition, production and performance. It employs analogue tape loops in performance, where a tape machine 'performer' records and hand-manipulates loops of an electric guitar performer on 'destroyed' tape stock (intentionally damaged tape), processing the output of this analogue system in the digital domain with contemporary sound processors. In doing so it investigates how the most extreme sonic signatures of analogue media – tape dropout and noise – can be employed alongside contemporary digital sound gestures in both compositional and performance contexts, and how the extremes of the two media signatures can be brought together both compositionally and performatively. In respect of genre, the work established strategies for merging compositional techniques from the early musique concrète tradition of the 1940s with late-60s popular music experimentalism and the laptop glitch electronica movement of the early 2000s. Lastly, the work explores how analogue recording studio technologies can be used as performance tools, thus illuminating and foregrounding the performative/gestural dimensions of traditional analogue studio tools in use."

Relevance: 10.00%

Abstract:

We propose a reliable and ubiquitous group key distribution scheme that is suitable for ad hoc networks. The scheme has self-initialisation and self-securing features. The former allows an arbitrary number of nodes to cooperate in initialising the system, and also allows node admission to be performed in a decentralised fashion. The latter allows a group member to determine the group key remotely while maintaining system security. We also consider a decentralised solution for establishing secure point-to-point communication, which allows a new node to establish a secure channel with every existing node provided it has pre-existing secure channels with a threshold number of the existing nodes.
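The abstract does not name the underlying primitive, but threshold schemes of this kind are commonly built on Shamir secret sharing, in which any t of n shares reconstruct a secret while fewer reveal nothing. A minimal sketch over a small prime field (the modulus and parameters are illustrative only, far too small for real use):

```python
import random

P = 2**13 - 1  # small Mersenne prime field modulus, illustrative only

def make_shares(secret, t, n):
    """Split secret into n shares; any t of them reconstruct it (Shamir)."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        # modular inverse of den via Fermat's little theorem
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(secret=1234, t=3, n=5)
assert reconstruct(shares[:3]) == 1234   # any 3 shares suffice
assert reconstruct(shares[1:4]) == 1234
```

In the scheme described, a joining node would obtain its key material over secure channels with a threshold number of existing nodes, which is what makes fully decentralised admission possible.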

Relevance: 10.00%

Abstract:

The conversion of one-way polyethylene terephthalate (PET) bottles into reusable bottles helps reduce environmental burden. Recently, the Ministry of the Environment in Japan began discussing the introduction of reusable bottles. One of the barriers to introducing the new type of bottle is consumer unwillingness to accept refilled reusable bottles. We administered questionnaires to consumers in a pilot test of reusable PET bottles, organized to analyze demand for these products. To increase demand for refilled reusable bottles, it is necessary to supply bottles that are acceptable to consumers who are concerned about container flaws and stains.

Relevance: 10.00%

Abstract:

The noble idea of studying seminal works to 'see what we can learn' turned in the 1990s into 'let's see what we can take', and in the last decade into a more toxic derivative: 'what else can't we take'. That is my observation as a student of architecture in the 1990s, and as a practitioner in the 2000s. In 2010, the sense that something is ending is clear. The next generation is rising and their gaze has shifted. The idea of classification (as a means of separation) was previously rejected by a generation of Postmodernists; the usefulness of difference declined. It is there in the plurality of the resulting architecture: a decision to mine history and seize from it in a wilful manner. This is a process of looking back but never forward. It has been a mono-culture of absorption, one that rejected the pursuit of the realistic. It is a blanket suffocating all practice of architecture in this country, from the mercantile to the intellectual. Independent reviews of Australia's recent contributions to the Venice Architecture Biennales confirm the malaise. The next generation is beginning to reconsider classification as a means of unification. By acknowledging the characteristics of competing forces it is possible to bring them into a state of tension. Seeking a beautiful contrast is a means to a new end. In the political setting, this is described by Noel Pearson as the radical centre[1]. The concept transcends the political and in its most essential form is a cultural phenomenon. It resists the compromised position and suggests that we can look back while looking forward. The radical centre is the only demonstrated opportunity where it is possible to pursue a realistic architecture. A realistic architecture in Australia may be partially resolved by addressing our anxiety of permanence. Farrelly's built desires[2] and Markham's ritual demonstrations[3] are two ways into understanding the broader spectrum of permanence.
But I think they are downstream of our core problem. Our problem, as architects, is that we are yet to come to terms with this place. Some call it landscape; others call it country. Australian cities were laid out on what was mistaken for a blank canvas, with the landscape considered only on those occasions when it presented insurmountable physical obstacles. The architecture since has continued to work on its piece of a constantly blank canvas. Even more ironic are the commercial awards programs that represent a claim within this framework but at best can only establish a dialogue within themselves. This is a closed system unable to look forward. It is said that Melbourne is the most European city in the southern hemisphere, but what is really being described there is the limitation of a senseless grid. After all, if the Dutch landscape informs Dutch architecture, why can't the Australian landscape inform Australian architecture? To do that, we would have to acknowledge our moribund grasp of the meaning of the Australian landscape, or more precisely what Indigenes call Country[4]. This is a complex notion and there are different ways into it. Country is experienced and understood through the senses and seared into memory. If one begins design at that starting point, it is not unreasonable to think we can arrive at an end point that is a counter-trajectory to where we have taken ourselves. A recent studio with Masters students confirmed this: start by finding Country and it would be impossible to end up with a building looking like an Aboriginal man's face. To date, architecture in Australia has overwhelmingly ignored Country on the back of terra nullius. It can't seem to get past the picturesque. Why is it so hard? The art world came to terms with this challenge; so too did the legal establishment; even the political scene headed into new waters. It would be easy to blame the budgets of commerce, or the constraints of program, or even the pressure of success.
But that is too easy. Those factors are in fact the kind of limitations that opportunities grow out of. The past decade of economic plenty has, for the most part, smothered the idea that our capitals might enable civic settings, or an architecture able to look past lot-line boundaries in a dignified manner. The denied opportunity for these settings to be prompted by the Country they occupy is criminal. The public realm is arrested in its development because we refuse to accept Country as a spatial condition. What we seem able to embrace are literal and symbolic gestures, usually taking the form of trumped-up art installations. All talk – no action. To continue to leave the public realm to the stewardship of mercantile interests is like embracing derivative lending after the global financial crisis. Herein rests an argument for why we need a resourced Government Architect's office operating not as an isolated lobbyist for business but as a steward of the public realm for both the past and the future. New South Wales is the leading model, with Queensland close behind. That is not to say both do not have flaws, but current calls for their cessation on the grounds of design parity poorly mask commercial self-interest. In Queensland, lobbyists are now heavily regulated, with an aim to ensure integrity and accountability. In essence, what I am speaking of will not be found in Reconciliation Action Plans that double as business plans, or the mining of Aboriginal culture for the next marketing gimmick, or even discussions around how to make buildings more 'Aboriginal'. It will come from the next generation, who reject the noxious mono-culture of absorption and embrace a counter-trajectory to pursue an architecture of realism.

Relevance: 10.00%

Abstract:

This thesis explored the utility of long-range stereo visual odometry for application on Unmanned Aerial Vehicles. Novel parameterisations and initialisation routines were developed for the long-range case of stereo visual odometry and new optimisation techniques were implemented to improve the robustness of visual odometry in this difficult scenario. In doing so, the applications of stereo visual odometry were expanded and shown to perform adequately in situations that were previously unworkable.
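The core difficulty in long-range stereo, which motivates the novel parameterisations above, is that depth from disparity (Z = fB/d in a pinhole stereo model) has error growing roughly quadratically with range: a fixed disparity error δd costs about Z²·δd/(fB) in depth. A numeric illustration (the focal length and baseline are arbitrary example values, not taken from the thesis):

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Pinhole stereo model: Z = f * B / d."""
    return f_px * baseline_m / disparity_px

# Illustrative camera: 1000 px focal length, 0.5 m baseline
f_px, B = 1000.0, 0.5
z_near = depth_from_disparity(f_px, B, 50.0)   # a point 10 m away
z_far = depth_from_disparity(f_px, B, 5.0)     # a point 100 m away

# Depth change caused by the same 0.1 px disparity error at each range
err_near = depth_from_disparity(f_px, B, 49.9) - z_near
err_far = depth_from_disparity(f_px, B, 4.9) - z_far

# Ten times the range costs roughly a hundred times the depth error,
# which is why naive depth parameterisations break down at long range.
assert err_far / err_near > 90
```

Inverse-depth style parameterisations are a common response to exactly this error structure, since disparity (and hence inverse depth) remains well conditioned where metric depth does not.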

Relevance: 10.00%

Abstract:

If the land sector is to make significant contributions to mitigating anthropogenic greenhouse gas (GHG) emissions in coming decades, it must do so while concurrently expanding production of food and fiber. In our view, mathematical modeling will be required to provide scientific guidance to meet this challenge. To be useful in GHG mitigation policy measures, models must simultaneously meet scientific, software engineering, and human capacity requirements. They can be used to understand GHG fluxes, to evaluate proposed GHG mitigation actions, and to predict and monitor the effects of specific actions; the latter applications require a change in mindset that has parallels with the shift from research modeling to decision support. We compare and contrast six agro-ecosystem models (FullCAM, DayCent, DNDC, APSIM, WNMM, and AgMod), chosen because they are used in Australian agriculture and forestry. Underlying structural similarities in the representations of carbon flows through plants and soils in these models are complemented by a diverse range of emphases and approaches to the subprocesses within the agro-ecosystem. None of these agro-ecosystem models handles all land-sector GHG fluxes, and considerable model-based uncertainty exists for soil C fluxes and enteric methane emissions. The models also show diverse approaches to the initialisation of model simulations, software implementation, distribution, licensing, and software quality assurance; each of these will differentially affect their usefulness for policy-driven GHG mitigation prediction and monitoring. Specific requirements imposed on the use of models by Australian mitigation policy settings are discussed, and areas for further scientific development of agro-ecosystem models for use in GHG mitigation policy are proposed.