51 results for Jacobi Symbol
in Queensland University of Technology - ePrints Archive
Abstract:
This paper provides new results on efficient arithmetic for elliptic curves in Jacobi quartic form, y² = dx⁴ + 2ax² + 1. With recent bandwidth-efficient proposals, arithmetic on Jacobi quartic curves has become solidly faster than that on Weierstrass curves. These proposals use up to 7 coordinates to represent a single point. However, fast scalar multiplication algorithms based on windowing techniques precompute and store several points, which requires more space than it would with 3 coordinates. Note also that some of these proposals require d = 1 for full speed. Unfortunately, elliptic curves whose number of points is twice a prime cannot be written in Jacobi quartic form with d = 1. Even worse, the contemporary formulae may fail to output correct coordinates for some inputs. This paper provides improved speeds using fewer coordinates without causing the above-mentioned problems. For instance, our proposed point doubling algorithm takes only 2 multiplications, 5 squarings, and no multiplications by curve constants when d is arbitrary and a = ±1/2.
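As a rough illustration of why windowing-based scalar multiplication drives up storage, the sketch below implements a generic 2^w-ary windowed multiplication in Python over a stand-in group (integers modulo a prime used in place of an elliptic-curve group). The function and parameter names are hypothetical, and the Jacobi quartic formulae of the paper are not reproduced.

```python
# Hypothetical sketch: fixed-window (2^w-ary) scalar multiplication.
# The group here is the additive group of integers mod p, used purely as a
# stand-in for an elliptic-curve group.

def window_scalar_mult(k, P, add, dbl, w=4):
    """Compute k*P with a 2^w-ary window; stores 2^w - 1 precomputed points."""
    # Precompute 1*P, 2*P, ..., (2^w - 1)*P -- this table is the memory cost.
    table = [P]
    for _ in range(2**w - 2):
        table.append(add(table[-1], P))

    digits = []
    while k:
        digits.append(k & (2**w - 1))
        k >>= w

    Q = None  # identity element
    for d in reversed(digits):
        if Q is not None:
            for _ in range(w):
                Q = dbl(Q)
        if d:
            Q = table[d - 1] if Q is None else add(Q, table[d - 1])
    return Q

# Toy stand-in group: integers modulo a prime, where "addition" is + mod p.
p = 2**61 - 1
add = lambda A, B: (A + B) % p
dbl = lambda A: (2 * A) % p

assert window_scalar_mult(123456789, 7, add, dbl) == (123456789 * 7) % p
```

The table of 2^w - 1 precomputed points is the storage the abstract refers to; the more coordinates each stored point carries, the larger every table entry becomes.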
Abstract:
Process modeling grammars are used to create models of business processes. In this paper, we discuss how different routing symbol designs affect an individual's ability to comprehend process models. We conduct an experiment with 154 students to ascertain which visual design principles influence process model comprehension. Our findings suggest that design principles related to perceptual discriminability and pop out improve comprehension accuracy. Furthermore, semantic transparency and aesthetic design of symbols lower the perceived difficulty of comprehension. Our results inform important principles about notational design of process modeling grammars and the effective use of process modeling in practice.
Abstract:
This paper presents an efficient low-complexity clipping noise compensation scheme for PAR-reduced orthogonal frequency division multiple access (OFDMA) systems. Conventional clipping noise compensation schemes proposed for OFDM systems are decision-directed schemes that use demodulated data symbols. Thus, these schemes fail to deliver the expected performance in OFDMA systems, where multiple users share a single OFDM symbol and a specific user may only know his or her own modulation scheme. The proposed clipping noise estimation and compensation scheme does not require knowledge of the demodulated symbols of the other users, making it very promising for OFDMA systems. It uses the equalized output and the reserved tones to reconstruct the signal by compensating the clipping noise. Simulation results show that the proposed scheme can significantly improve system performance.
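For readers unfamiliar with clipping noise, the toy numpy sketch below generates an OFDM-style symbol, clips its amplitude, and measures the resulting distortion. The block length and clipping threshold are illustrative assumptions; this is not the compensation scheme proposed in the paper.

```python
# Toy illustration of amplitude clipping and the resulting clipping noise on
# an OFDM-style symbol. Parameters (N, threshold A) are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
N = 256                                    # subcarriers in one OFDM symbol
X = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)  # QPSK
x = np.fft.ifft(X) * np.sqrt(N)            # time-domain symbol, unit average power

A = 1.6                                    # clipping threshold (amplitude)
mag = np.abs(x)
x_clipped = np.where(mag > A, A * x / mag, x)
clip_noise = x_clipped - x                 # the distortion a receiver must undo

par = lambda s: 10 * np.log10(np.max(np.abs(s)**2) / np.mean(np.abs(s)**2))
print(f"PAR before: {par(x):.2f} dB, after clipping: {par(x_clipped):.2f} dB")
print(f"Clipping-noise power: {np.mean(np.abs(clip_noise)**2):.4f}")
```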
Abstract:
The literature on corporate identity management suggests that managing corporate identity is a strategically complex task embracing the shaping of a range of dimensions of organisational life. The performance measurement literature and its applications likewise emphasise an organisation's ability to incorporate various dimensions, considering both financial and non-financial performance measures, when assessing success. The inclusion of these soft, non-financial measures challenges organisations to quantify intangible aspects of performance such as corporate identity, transforming unmeasurables into measurables. This paper explores the regulatory roles of the balanced scorecard in shaping key dimensions of corporate identities in a public sector shared service provider in Australia. The case study employs qualitative interviews with senior managers and employees, secondary data and participant observation. The findings suggest that the use of the balanced scorecard has the potential to support identity construction, as an organisational symbol, as a communication tool for vision, and as strategy, through creating conversations that self-regulate behaviour. The development of an integrated performance measurement system, the balanced scorecard, becomes an expression of a desired corporate identity, while the performance measures and the continuous process provide the resource for interpreting actual corporate identities. Through this process of understanding and mobilising the interaction, it may be possible to create a less obtrusive and more subtle way to control “what an organisation is”. This case study also suggests that the theoretical and practical fusion of disciplinary knowledge around corporate identities and performance measurement systems could contribute to the understanding and shaping of corporate identities.
Abstract:
This paper improves implementation techniques for Elliptic Curve Cryptography. We introduce new formulae and algorithms for the group law on Jacobi quartic, Jacobi intersection, Edwards, and Hessian curves. The proposed formulae and algorithms save time when suitable point representations are used. To support our claims, a cost comparison is made with classic scalar multiplication algorithms using previous and current operation counts. Most notably, the best speeds are obtained from Jacobi quartic curves, which provide the fastest timings for most scalar multiplication strategies, benefiting from the proposed 2M + 5S + 1D point doubling and 7M + 3S + 1D point addition algorithms. Furthermore, the new addition algorithm provides an efficient way to protect against side channel attacks based on simple power analysis (SPA). Keywords: efficient elliptic curve arithmetic, unified addition, side channel attack.
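As a back-of-the-envelope sketch of how such operation counts translate into scalar multiplication cost, the snippet below weighs the quoted doubling and addition counts under assumed relative field-operation costs (S = 0.8M, D = 0) and an assumed number of additions per scalar bit. These weights, the adds-per-bit ratio, and the function name are assumptions, not figures from the paper.

```python
# Back-of-the-envelope cost model for the operation counts quoted in the
# abstract (2M + 5S + 1D doubling, 7M + 3S + 1D addition).
# The S/M and D/M weights and adds_per_bit are assumptions.

M, S, D = 1.0, 0.8, 0.0        # assumed relative field-operation costs

def scalar_mult_cost(bits, dbl_cost, add_cost, adds_per_bit=0.5):
    """Estimated field-multiplication equivalents for a double-and-add pass."""
    return bits * dbl_cost + bits * adds_per_bit * add_cost

dbl = 2*M + 5*S + 1*D          # quoted Jacobi quartic doubling count
add = 7*M + 3*S + 1*D          # quoted Jacobi quartic addition count

print(f"256-bit scalar multiplication ~ {scalar_mult_cost(256, dbl, add):.0f} "
      "field multiplications")
```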
Abstract:
This paper explores the possibility of including human factoring in a business process model. The importance of doing so is fourfold: (1) the organization becomes transparent in its processes, as all participants (humans, activities and events) are identifiable; (2) including human factoring allows organizations to hire according to process needs; (3) human factoring alleviates the work-related stress currently being encountered; and (4) it enables a quicker transition for new employees into their job scope. This was made possible by including a human behaviour layer between pools within a process to depict human behaviour and feelings. Future work includes incorporating a human thought symbol and a human interaction symbol into the Business Process Modelling Notation (BPMN).
Abstract:
As an Aboriginal woman currently reviewing feminist literature in Australia, I have found that representations of Aboriginal women's gender have been generated predominantly by women anthropologists. Australian feminists utilise this literature in their writing and teaching and accept its truths without question; the most often quoted ethnographic text is Diane Bell's Daughters of the Dreaming (1983a). Feminists' lack of critical engagement with this literature implies that they are content to accept women anthropologists' representations because Aboriginal women are not central to their constructions of feminism. Instead, the Aboriginal woman is positioned on the margins, a symbol of difference; a reminder that it is feminists who are the bearers of true womanhood.
Abstract:
This essay--part of a special issue on the work of Gunther Kress--uses the idea of affordances and constraints to explore the (im)possibilities of new environments for engaging with literature written for children (see Kress, 2003). In particular, it examines a festival of children's literature from an Australian education context that occurs online. The festival is part of a technologically mediated library space designated by the term libr@ry (Kapitzke & Bruce, 2006). The @ symbol (French word "arobase") inserted into the word library indicates that technological mediation has a history, an established set of social practices, and a political economy, which even chatrooms with "real" authors may alter but not fully supplant.
Abstract:
Practice-led or multi-modal theses (describing examinable outcomes of postgraduate study which comprise the practice of dancing/choreography with an accompanying exegesis) are an emerging strength of dance scholarship; a form of enquiry that has been gaining momentum for over a decade, particularly in Australia and the United Kingdom. It has been strongly argued that, in this form of research, legitimate claims to new knowledge are embodied predominantly within the practice itself (Pakes, 2003) and that these findings are emergent, contingent and often interstitial, contained within both the material form of the practice and the symbolic languages surrounding the form. In a recent study on ‘dancing’ theses, Phillips, Stock and Vincs (2009) found general agreement among academics and artists that ‘there could be more flexibility in matching written language with conceptual thought expressed in practice’. The authors discuss how the seemingly intangible nature of danced/embodied research, reliant on what Melrose (2003) terms ‘performance mastery’ by the ‘expert practitioner’ (2006, Point 4) involving ‘expert’ intuition (2006, Point 5), might be accessed, articulated and validated in terms of alternative ways of knowing, through exploring an ongoing dialogue in which the danced practice develops emergent theory. They also propose ways in which the danced thesis can be ‘converted’ into the required ‘durable’ artefact which the ephemerality of live performance denies, drawing on Rye’s ‘multi-view’ digital record (2003) and Stapleton’s ‘multi-voiced audio visual document’ (2006, 82). Building on a two-year research project (2007-2008), Dancing Between Diversity and Consistency: Refining Assessment in Postgraduate Degrees in Dance, which examined such issues in relation to assessment in an Australian context, the three researchers have further explored issues around interdisciplinarity, cultural differences and documentation by engaging with the following questions: How do we represent research in which understandings, meanings and findings are situated within the body of the dancer/choreographer? Do these need a form of ‘translating’ into textual form in order to be accessed as research? What kinds of language structures can be developed to effect this translation: metaphor, allusion, symbol? How important is contextualising the creative practice? How do we incorporate differing cultural inflections and practices into our reading and evaluation? What kind of layered documentation can assist in producing a ‘durable’ research artefact from a non-reproducible live event?
Abstract:
Aims: This study investigated the effect of simulated visual impairment on the speed and accuracy of performance on a series of commonly used cognitive tests. Methods: Cognitive performance was assessed for 30 young, visually normal subjects (M = 22.0 ± 3.1 years) using the Digit Symbol Substitution Test (DSST), the Trail Making Test (TMT) A and B, and the Stroop Colour Word Test under three visual conditions: normal vision and two levels of visually degrading filters (Vistech™) administered in random order. Distance visual acuity and contrast sensitivity were also assessed for each filter condition. Results: The visual filters, which degraded contrast sensitivity to a greater extent than visual acuity, significantly increased the time taken to complete the DSST and the TMT A and B (p<0.05) but not the number of errors made, and affected only some components of the Stroop test. Conclusions: Reduced contrast sensitivity had a marked effect on the speed but not the accuracy of performance on commonly used cognitive tests, even in young individuals; the implications of these findings are discussed.
Abstract:
The Sydney Harbour Bridge provides an imaginative space that is revisited by Australian writers in particular ways. In this space novelists, poets, and cultural historians negotiate questions of emotional and psychological transformation as well as reflect on social and environmental change in the city of Sydney. The writerly tensions that mark these accounts often alter, or query, representations of the Bridge as a symbol of material progress and demonstrate a complex creative engagement with the Bridge. This discussion of ‘the Bridge’ focuses on the work of four authors, Eleanor Dark, P.R. Stephensen, Peter Carey and Vicki Hastrich and includes a range of other fictional and non-fictional accounts of ‘Bridge-writing.’ The ideas proffered are framed by a theorising of space, especially referencing the work of Michel de Certeau, whose writing on the spatial ambiguity of a bridge is important to the examination of the diverse ways in which Australian writers have engaged with the imaginative potential and almost mythic resonance of the Sydney Harbour Bridge.
Abstract:
Purpose: To evaluate the on-road driving performance of persons with homonymous hemianopia or quadrantanopia in comparison to age-matched controls with normal visual fields. Methods: Participants were 22 hemianopes and eight quadrantanopes (mean age 53 years) and 30 persons with normal visual fields (mean age 52 years); all were either current drivers or aiming to resume driving. All participants completed a battery of vision tests (ETDRS visual acuity, Pelli-Robson letter contrast sensitivity, Humphrey visual fields), cognitive tests (Trails A and B, Mini-Mental State Examination, Digit Symbol Substitution) and an on-road driving assessment. Driving performance was assessed in a dual-brake vehicle with safety monitored by a certified driving rehabilitation specialist. Backseat evaluators masked to the clinical characteristics of participants independently rated driving performance along a 22.7-kilometre route involving urban and interstate driving. Results: Seventy-three per cent of the hemianopes, 88 per cent of the quadrantanopes and all of the drivers with normal fields received safe driving ratings. The hemianopic and quadrantanopic drivers rated as unsafe tended to have problems with maintaining appropriate lane position, steering steadiness and gap judgment compared to controls. Unsafe driving was associated with slower visual processing speed and impairments in contrast sensitivity, visual field sensitivity and executive function. Conclusions: Our findings suggest that some drivers with hemianopia or quadrantanopia are capable of safe driving performance when compared to those of the same age with normal visual fields. This finding has important implications for the assessment of fitness to drive in this population.
Abstract:
This thesis is about the derivation of the addition law on an arbitrary elliptic curve and efficiently adding points on this elliptic curve using the derived addition law. The outcomes of this research guarantee practical speedups in higher-level operations which depend on point additions. In particular, the contributions immediately find applications in cryptology. Mastered by 19th-century mathematicians, the theory of elliptic curves has been an active area of study for decades. Elliptic curves over finite fields made their way into public key cryptography in the late 1980s with independent proposals by Miller [Mil86] and Koblitz [Kob87]. Elliptic Curve Cryptography (ECC), following Miller's and Koblitz's proposals, employs the group of rational points on an elliptic curve in building discrete logarithm based public key cryptosystems. Starting from the late 1990s, the emergence of the ECC market has boosted research into the computational aspects of elliptic curves. This thesis falls into this same area of research, where the main aim is to speed up the addition of rational points on an arbitrary elliptic curve (over a field of large characteristic). The outcomes of this work can be used to speed up applications which are based on elliptic curves, including cryptographic applications in ECC. The aforementioned goals of this thesis are achieved in five main steps. As the first step, this thesis brings together several algebraic tools in order to derive the unique group law of an elliptic curve. This step also includes an investigation of the capabilities of recent computer algebra packages. Although the group law is unique, its evaluation can be performed using abundant (in fact, infinitely many) formulae. As the second step, this thesis advances the search for the best formulae for efficient addition of points. In the third step, the group law is stated explicitly by handling all possible summands. The fourth step presents the algorithms to be used for efficient point additions. In the fifth and final step, optimized software implementations of the proposed algorithms are presented in order to show that the theoretical speedups of step four can be obtained in practice. In each of the five steps, this thesis focuses on five forms of elliptic curves over finite fields of large characteristic. These forms and their defining equations are: (a) Short Weierstrass form, y² = x³ + ax + b; (b) Extended Jacobi quartic form, y² = dx⁴ + 2ax² + 1; (c) Twisted Hessian form, ax³ + y³ + 1 = dxy; (d) Twisted Edwards form, ax² + y² = 1 + dx²y²; (e) Twisted Jacobi intersection form, bs² + c² = 1, as² + d² = 1. These forms are the most promising candidates for efficient computations and are thus considered in this work. Nevertheless, the methods employed in this thesis are capable of handling arbitrary elliptic curves. From a high-level point of view, the following outcomes are achieved in this thesis.
- Related literature results are brought together and further revisited. In most of the cases, several previously missed formulae, algorithms, and efficient point representations are discovered.
- Analogies are made among all studied forms. For instance, it is shown that two sets of affine addition formulae are sufficient to cover all possible affine inputs, as long as the output is also an affine point, in any of these forms. In the literature, many special cases, especially interactions with points at infinity, were omitted from discussion. This thesis handles all of the possibilities.
- Several new point doubling/addition formulae and algorithms are introduced, which are more efficient than the existing alternatives in the literature. Most notably, the speeds of the extended Jacobi quartic, twisted Edwards, and Jacobi intersection forms are improved. New unified addition formulae are proposed for the short Weierstrass form. New coordinate systems are studied for the first time.
- An optimized implementation is developed using a combination of generic x86-64 assembly instructions and plain C. The practical advantages of the proposed algorithms are supported by computer experiments.
- All formulae presented in the body of this thesis are checked for correctness using computer algebra scripts, together with details on register allocations.
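As a minimal, self-contained illustration of two of the themes above (unified addition and machine-checked correctness), the sketch below implements the well-known affine unified addition law on a twisted Edwards curve over a toy prime field and verifies that its outputs stay on the curve. The curve constants (p = 1019, a = 1, d = 123) and helper names are illustrative assumptions; the thesis's optimized projective algorithms are not reproduced here.

```python
# Sketch of the (well-known) unified affine addition law on a twisted Edwards
# curve a*x^2 + y^2 = 1 + d*x^2*y^2 over a toy prime field, plus a numerical
# on-curve correctness check in the spirit described above.

p = 1019                      # toy prime, p % 4 == 3 so square roots are easy
a, d = 1, 123                 # illustrative curve constants (a != d, d != 0)

inv = lambda v: pow(v, p - 2, p)

def on_curve(P):
    x, y = P
    return (a * x * x + y * y - 1 - d * x * x * y * y) % p == 0

def add(P, Q):
    """Unified twisted Edwards addition; the same formula also doubles."""
    (x1, y1), (x2, y2) = P, Q
    t = d * x1 * x2 * y1 * y2
    x3 = (x1 * y2 + y1 * x2) * inv(1 + t) % p
    y3 = (y1 * y2 - a * x1 * x2) * inv(1 - t) % p
    return (x3, y3)

# Find a sample point by solving y^2 = (1 - a*x^2) / (1 - d*x^2) for some x.
for x in range(1, p):
    rhs = (1 - a * x * x) * inv(1 - d * x * x) % p
    if pow(rhs, (p - 1) // 2, p) == 1:          # quadratic residue?
        P = (x, pow(rhs, (p + 1) // 4, p))
        break

assert on_curve(P)
assert on_curve(add(P, P))                      # doubling stays on the curve
assert add(P, (0, 1)) == P                      # (0, 1) acts as the identity
```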
Abstract:
This paper investigates the impact of carrier frequency offset (CFO) on Single Carrier wireless communication systems with Frequency Domain Equalization (SC-FDE). We show that CFO in SC-FDE systems causes irrecoverable channel estimation error, which leads to inter-symbol interference (ISI). The impact of CFO on SC-FDE and OFDM is compared in the presence of CFO and channel estimation errors. Closed-form expressions for the signal-to-interference-and-noise ratio (SINR) are derived for both systems and verified by simulation results. We find that when channel estimation errors are considered, SC-FDE is similarly or even more sensitive to CFO than OFDM. In particular, in SC-FDE systems, CFO mainly deteriorates system performance by degrading the channel estimation. Both the analytical and the simulation results highlight the importance of accurate CFO estimation in SC-FDE systems.
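A toy numpy sketch of the underlying effect is given below: a normalised CFO applied to one single-carrier block appears as a progressive phase rotation in time and as leakage across frequency bins after the receiver FFT. The block length and epsilon are illustrative assumptions, and the paper's closed-form SINR expressions are not reproduced.

```python
# Toy sketch of how a carrier frequency offset (CFO) rotates a block of
# time-domain samples and distorts the FFT output used by an SC-FDE receiver.
# epsilon is the CFO normalised to the subcarrier spacing (assumed value).
import numpy as np

rng = np.random.default_rng(0)
N = 64
s = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)  # SC block

epsilon = 0.05                                   # normalised CFO
n = np.arange(N)
s_cfo = s * np.exp(1j * 2 * np.pi * epsilon * n / N)   # progressive phase rotation

S, S_cfo = np.fft.fft(s), np.fft.fft(s_cfo)
distortion = S_cfo - S                           # deviation seen at the FFT output
sdr = 10 * np.log10(np.mean(np.abs(S)**2) / np.mean(np.abs(distortion)**2))
print(f"signal-to-distortion ratio at the FFT output: {sdr:.1f} dB")
```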