10 results for Software Transactional Memory (STM)

in CORA - Cork Open Research Archive - University College Cork - Ireland


Relevance:

20.00%

Abstract:

With the rapid growth of the Internet and digital communications, the volume of sensitive electronic transactions transferred and stored over and on insecure media has increased dramatically in recent years. The growing demand for cryptographic systems to secure this data, across a multitude of platforms ranging from large servers to small mobile devices and smart cards, has necessitated research into low-cost, flexible and secure solutions. As constraints on architectures such as area, speed and power become key factors in choosing a cryptosystem, methods for speeding up the development and evaluation process are necessary. This thesis investigates flexible hardware architectures for the main components of a cryptographic system. Dedicated hardware accelerators can provide significant performance improvements when compared to implementations on general-purpose processors. Each of the proposed designs is analysed in terms of speed, area, power, energy and efficiency. Field Programmable Gate Arrays (FPGAs) are chosen as the development platform due to their fast development time and reconfigurable nature.

Firstly, a reconfigurable architecture for performing elliptic curve point scalar multiplication on an FPGA is presented. Elliptic curve cryptography is one such method to secure data, offering security levels similar to those of traditional systems, such as RSA, but with smaller key sizes, translating into lower memory and bandwidth requirements. The architecture is implemented using different underlying algorithms and coordinates for dedicated Double-and-Add algorithms, twisted Edwards algorithms and SPA-secure algorithms, and its power consumption and energy are measured on an FPGA. Hardware implementation results for these new algorithms are compared against their software counterparts, and the best choices for minimum area-time and area-energy circuits are then identified and examined for larger key and field sizes.

Secondly, implementation methods are presented for another component of a cryptographic system, namely the hash functions developed in the recently concluded SHA-3 hash competition. Various designs from the three rounds of the NIST-run competition are implemented on FPGA, along with an interface to allow fair comparison of the different hash functions when operating in a standardised and constrained environment. Different implementation methods for the designs are examined, and their performance is assessed in terms of throughput, area and energy costs using various constraint metrics. Comparing many different implementation methods and algorithms is nontrivial; another aim of this thesis is therefore the development of generic interfaces, used both to reduce implementation and test time and to enable fair baseline comparisons of different algorithms operating in a standardised and constrained environment.

Finally, a hardware-software co-design cryptographic architecture is presented. This architecture is capable of supporting multiple types of cryptographic algorithms and is described through an application performing public key cryptography, namely the Elliptic Curve Digital Signature Algorithm (ECDSA). It makes use of the elliptic curve architecture and the hash functions described previously. These components, along with a random number generator, provide hardware acceleration for a MicroBlaze-based cryptographic system. The trade-off between performance and flexibility is discussed using dedicated software and hardware-software co-design implementations of the elliptic curve point scalar multiplication block. Results are then presented in terms of the overall cryptographic system.
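For readers unfamiliar with the Double-and-Add algorithm named above, the following is a minimal Python sketch over a toy short-Weierstrass curve y^2 = x^3 + ax + b (mod p); the tiny parameters are illustrative assumptions with no relation to the curves or FPGA datapaths studied in the thesis.

    # Toy affine elliptic-curve arithmetic and Double-and-Add scalar multiplication.
    p, a, b = 97, 2, 3          # illustrative prime field and curve coefficients
    O = None                    # point at infinity (the group identity)

    def add(P, Q):
        """Affine point addition/doubling on y^2 = x^3 + ax + b over GF(p)."""
        if P is O: return Q
        if Q is O: return P
        (x1, y1), (x2, y2) = P, Q
        if x1 == x2 and (y1 + y2) % p == 0:
            return O                                          # P + (-P) = O
        if P == Q:
            lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p  # tangent slope
        else:
            lam = (y2 - y1) * pow(x2 - x1, -1, p) % p         # chord slope
        x3 = (lam * lam - x1 - x2) % p
        return (x3, (lam * (x1 - x3) - y1) % p)

    def scalar_mult(k, P):
        """Left-to-right Double-and-Add over the bits of the scalar k."""
        R = O
        for bit in bin(k)[2:]:
            R = add(R, R)           # double once per key bit
            if bit == '1':
                R = add(R, P)       # conditional add when the bit is set
        return R

    print(scalar_mult(20, (3, 6)))  # 20*P on the toy curve

The key-dependent conditional add in scalar_mult is precisely what SPA-secure variants are designed to remove, since such branches can leak key bits through a device's power trace.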

Relevance:

20.00%

Abstract:

The great demand for power-optimised devices shows promising economic potential and has drawn considerable attention in both industry and research. As the CMOS process continues to shrink, not only dynamic power but also static power has emerged as a major concern in power reduction. Beyond power optimisation, average-case power estimation is significant for power budget allocation, but is also challenging in terms of time and effort. In this thesis, we introduce a methodology to support modular quantitative analysis for estimating the average power of circuits, on the basis of two concepts named Random Bag Preserving and Linear Compositionality. It shortens simulation time while sustaining high accuracy, increasing the feasibility of power estimation for large systems. For power saving, we first take advantage of the low-power characteristics of adiabatic logic and asynchronous logic to achieve ultra-low dynamic and static power. We propose two memory cells that can run in adiabatic and non-adiabatic modes; about 90% of dynamic power can be saved in adiabatic mode when compared to other up-to-date designs, and about 90% of leakage power is also saved. Secondly, a novel logic named Asynchronous Charge Sharing Logic (ACSL) is introduced, in which the realisation of completion detection is simplified considerably. Beyond the improvement in power reduction, ACSL brings another promising feature for average power estimation, data independence, which makes power estimation effortless and meaningful for modular quantitative average-case analysis. Finally, a new asynchronous Arithmetic Logic Unit (ALU) is presented, with a ripple-carry adder implemented using the logically reversible/bidirectional characteristic and exhibiting ultra-low power dissipation at a sub-threshold operating point. The proposed adder is able to operate multi-functionally.
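As background to the dynamic/static split discussed above, the standard first-order CMOS power model (textbook material, not a result of this thesis) can be written as

    P_{\text{total}} \;=\; \underbrace{\alpha\, C_L\, V_{DD}^{2}\, f}_{\text{dynamic (switching)}} \;+\; \underbrace{I_{\text{leak}}\, V_{DD}}_{\text{static (leakage)}}

where \alpha is the switching activity factor, C_L the switched load capacitance, V_{DD} the supply voltage, f the clock frequency and I_{\text{leak}} the leakage current. Adiabatic logic targets the dynamic term by recovering charge from C_L instead of dissipating it on every transition, while sub-threshold operation lowers V_{DD}, and hence both terms, at the cost of speed.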

Relevance:

20.00%

Abstract:

Error correcting codes are combinatorial objects designed to enable reliable transmission of digital data over noisy channels. They are used ubiquitously in communication, data storage and elsewhere. Error correction allows reconstruction of the original data from the received word. Classical decoding algorithms are constrained to output just one codeword; however, in the late 1950s researchers proposed a relaxed error correction model for potentially large error rates, known as list decoding. The research presented in this thesis focuses on reducing the computational effort and enhancing the efficiency of decoding algorithms for several codes, from both algorithmic and architectural standpoints. The codes in consideration are linear block codes closely related to Reed-Solomon (RS) codes. A high-speed, low-complexity algorithm and architecture are presented for encoding and decoding RS codes based on evaluation. The implementation results show that the hardware resources and the total execution time are significantly reduced compared to the classical decoder. The evaluation-based encoding and decoding schemes are modified and extended for shortened RS codes, and a software implementation shows a substantial reduction in memory footprint at the expense of latency. Hermitian codes can be seen as concatenated RS codes and are much longer than RS codes over the same alphabet. A fast, novel and efficient VLSI architecture for Hermitian codes is proposed based on interpolation decoding; the proposed architecture is shown to outperform Kötter’s decoder for high-rate codes. The thesis also explores a method of constructing optimal codes by computing the subfield subcodes of Generalized Toric (GT) codes, a natural extension of RS codes to several dimensions. The polynomial generators, or evaluation polynomials, for subfield subcodes of GT codes are identified, based on which the dimension and a bound for the minimum distance are computed. The algebraic structure of the polynomials evaluating into the subfield is used to simplify the list decoding algorithm for BCH codes. Finally, an efficient and novel approach is proposed for exploiting powerful codes with complex decoding but a simple encoding scheme (comparable to RS codes) for multihop wireless sensor network (WSN) applications.
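To make the evaluation-based encoding concrete, the following is a minimal Python sketch of the underlying idea: a Reed-Solomon codeword is simply the message polynomial evaluated at n distinct field points. For readability it works over a prime field GF(p) rather than the GF(2^m) fields typical of hardware; the parameters p, n and k are illustrative assumptions, not the thesis's design values.

    # Evaluation-based Reed-Solomon encoding over a toy prime field GF(p).
    p = 929             # field modulus (toy choice; real designs use GF(2^m))
    n, k = 8, 4         # code length and message length, n <= p

    def rs_encode(msg, points):
        """Evaluate the degree < k message polynomial at n distinct points.
        Any k correct evaluations determine the polynomial, so up to
        (n - k) // 2 symbol errors can be corrected (more with list decoding)."""
        def poly_eval(coeffs, x):
            acc = 0
            for c in reversed(coeffs):   # Horner's rule, coefficients ascending
                acc = (acc * x + c) % p
            return acc
        return [poly_eval(msg, x) for x in points]

    codeword = rs_encode([3, 1, 4, 1], list(range(1, n + 1)))
    print(codeword)     # 8 symbols carrying a 4-symbol message

Decoding then amounts to recovering the low-degree polynomial from possibly corrupted evaluations, which is exactly where interpolation-based decoders such as Kötter’s algorithm and the list decoders mentioned above come in.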

Relevance:

20.00%

Abstract:

This thesis critically investigates the divergent international approaches to the legal regulation of the patentability of computer software inventions, with a view to identifying the reforms necessary for a certain, predictable and uniform inter-jurisdictional system of protection. Through a critical analysis of the traditional and contemporary US and European regulatory frameworks of protection for computer software inventions, this thesis demonstrates the confusion and legal uncertainty resulting from ill-defined patent laws and inconsistent patent practices as to the scope of the “patentable subject matter” requirement, further compounded by substantial flaws in the structural configuration of the decision-making procedures within which the patent systems operate. This damaging combination prevents the operation of an accessible and effective Intellectual Property (IP) legal framework of protection for computer software inventions, one capable of securing adequate economic returns for inventors whilst preserving the necessary scope for innovation and competition in the field, to the ultimate benefit of society. In exploring the substantive and structural deficiencies in the European and US regulatory frameworks, this thesis ultimately argues that the best approach to reforming the legal regulation of software patentability is two-tiered. It demonstrates that any reform to achieve international legal harmony first requires the legislature to individually clarify (Europe) or restate (US) the long-standing, inadequate rules governing the scope of software “patentable subject matter”, together with a reorganisation of the unworkable structural configuration of the decision-making procedures. Informed by the critical analysis of the evolution of the “patentable subject matter” requirement for computer software in the US, this thesis then considers the potential of the reforms of the European patent system currently underway to bring about certainty, predictability and uniformity in the legal treatment of computer software inventions.

Relevance:

20.00%

Abstract:

For at least two millennia and probably much longer, the traditional vehicle for communicating geographical information to end-users has been the map. With the advent of computers, the means of both producing and consuming maps have been radically transformed, while the inherent nature of the information product has also expanded and diversified rapidly. This has given rise in recent years to the new concept of geovisualisation (GVIS), which draws on the skills of the traditional cartographer but extends them into three spatial dimensions, and may also add temporality, photorealistic representations and/or interactivity. Demand for GVIS technologies and their applications has increased significantly in recent years, driven by the need to study complex geographical events, and in particular their associated consequences, and to communicate the results of these studies to a diversity of audiences and stakeholder groups. GVIS involves data integration, multi-dimensional spatial display, advanced modelling techniques, dynamic design and development environments, and field-specific application needs. To meet these needs, GVIS tools should be both powerful and inherently usable, in order to facilitate their role in helping to interpret and communicate geographic problems. However, no framework currently exists for ensuring this usability.

The research presented here seeks to fill this gap by addressing the challenges of incorporating user requirements in GVIS tool design. It starts from the premise that usability in GVIS should be incorporated and implemented throughout the whole design and development process. To facilitate this, Subject Technology Matching (STM) is proposed as a new approach to assessing and interpreting user requirements. Based on STM, a new design framework called Usability Enhanced Coordination Design (UECD) is then presented, with the purpose of raising the overall usability of the design outputs. UECD places GVIS experts in a new key role in the design process, forming a more coordinated and integrated workflow and more focused and interactive usability testing.

To prove the concept, these theoretical elements of the framework have been implemented in two test projects: one is the creation of a coastal inundation simulation for Whitegate, Cork, Ireland; the other is a flood mapping tool for Zhushan Town, Jiangsu, China. The two case studies successfully demonstrated the potential merits of the UECD approach when GVIS techniques are applied to geographic problem solving and decision making. The thesis delivers a comprehensive understanding of the development and challenges of GVIS technology, its usability concerns and the associated user-centred design (UCD); it explores the possibility of embedding a UCD framework in GVIS design; it constructs a new theoretical design framework, UECD, which aims to make the whole design process usability-driven; and it develops the key concept of STM into a template set to improve the performance of a GVIS design. These key conceptual and procedural foundations can be built on by future research aimed at further refining and developing UECD as a useful design methodology for GVIS scholars and practitioners.

Relevance:

20.00%

Abstract:

Robert Briscoe was the Dublin-born son of Lithuanian and German-Jewish immigrants. As a young man he joined Sinn Féin and became an important figure in the War of Independence through his role as one of the IRA’s main gun-procuring agents. He took the anti-Treaty side during the internecine Civil War, mainly due to the influence of Eamon de Valera, and retained a filial devotion towards him for the rest of his life. In 1926 he was a founding member of Fianna Fáil, de Valera’s breakaway republican party, which would dominate twentieth-century Irish politics. He was first elected as a Fianna Fáil T.D. (Teachta Dála, Deputy to the Dáil) in 1927, and successfully defended his seat eleven times, becoming the first Jewish Lord Mayor of Dublin in 1956, an honour that was repeated in 1961. On this basis alone, it can be argued that Briscoe was a significant presence in an embryonic Irish political culture; however, when his role in the 1930s Jewish immigration endeavour is acknowledged, it is clear that he played a unique part in one of the most contentious political and social discourses of the pre-war years. This was reinforced when Briscoe embraced Zionism, in a belated realisation that the survival of his European co-religionists could only be guaranteed if an independent Jewish state existed.

This information is to a certain degree public knowledge; however, the full extent of his involvement as an immigration advocate for potential Jewish refugees, and the seniority he achieved in the New Zionist Organisation (Revisionists), has not been fully recognised. This is partly explicable because researchers have based their assessment of Briscoe on an incomplete political archive in the National Library of Ireland (NLI). The vast majority of documentation pertaining to his involvement in the immigration endeavour has not been available to scholars and remains the private property of Robert Briscoe’s son, Ben Briscoe. The lack of immigration files in the NLI was compounded by the fact that information about Briscoe’s Revisionist engagement was donated to the Jabotinsky Institute in Tel Aviv and can only be accessed physically by visiting Israel. Therefore, even though these twin endeavours have been commented on by a number of academics, their assessments have tended to be based on an incomplete archive, supplemented by Briscoe’s autobiographical memoir published in 1958.

This study attempts to fill the gaps in Briscoe’s complex political narrative by incorporating the rarely used private papers of Robert Briscoe and the difficult-to-access Briscoe files in Tel Aviv. This undertaking was only possible when Mr. Ben Briscoe graciously granted me full and unrestricted access to his father’s papers, and after a month-long research trip to the Jabotinsky Institute in Tel Aviv. Access to this rarely used documentation facilitated a holistic examination of Briscoe’s complex and multifaceted political reality. It revealed the full extent of Briscoe’s political and social evolution as the Nazi-instigated Jewish emigration crisis reached catastrophic proportions. He was by turns Fianna Fáil nationalist, Jewish immigration advocate and senior Revisionist actor on a global stage. The study examines the contrasting political and social forces that initiated each stage of Briscoe’s Zionist awakening, and in the process fills a major gap in Irish-Jewish historiography by revealing the full extent of his Revisionist engagement.

Relevance:

20.00%

Abstract:

In moments of rapid social change, as witnessed in Ireland in the last decade, the conditions through which people engage with their localities through memory, individually and collectively, remain an important cultural issue, with key implications for questions of heritage, preservation and civic identity. In recent decades, cultural geographers have argued that landscape is more than just a view or a static text of something symbolic. The emphasis is on landscape as a dynamic cultural process, ever-evolving, constantly constructed and re-constructed. Landscape is thus a highly complex term that carries many different meanings: material, form, relationships and actions mean different things in different settings.

Drawing upon recent and continuing scholarly debates on cultural landscapes and collective memory, this thesis sets out to examine the generation of collective memory and how it is employed as a cultural tool in the production of memory in the landscape. More specifically, the research considers the relationships between landscape and memory, investigating the ways in which places are produced, appropriated, experienced, sensed, acknowledged, imagined, yearned for, re-appropriated, contested and identified with. A polyvocal-bricoleur approach aims to get below the surface of a cultural landscape, inject historical research and temporal depth into cultural landscape studies, and instil a genuine sense of inclusivity of a wide variety of voices (the role of monuments and rituals, and the voices of people) from the past and present. The polyvocal-bricoleur approach inspires a mixed-methods approach to field sites through archival research, fieldwork and filmed interviews.

Using a series of mini-vignettes of place narratives in the River Lee valley in the south of Ireland, the thesis explores a number of questions on the fluid nature of narrative in representing the story and role of the landscape in memory-making. The case studies in the Lee Valley are harnessed to investigate these questions, themes and debates in the act of memory-making at sites ranging from an Irish War of Independence memorial to the River Lee’s hydroelectric scheme and the valley’s key religious pilgrimage site. The thesis investigates the idea that the process of landscape extends not only across space but also across time: the concept of historical continuity, and the individual and collective human engagement with and experience of this continuity, are central to the processes of remembering on the landscape. In addition, the thesis debates the idea that the production of landscape is conditioned by several social frames of memory, with individuals remembering according to several social frames that give emphasis to different aspects of the reality of human experience. The thesis also reflects on how the process of landscape is represented by those who re-produce its narratives in various media.

Relevance:

20.00%

Abstract:

A growing number of software development projects successfully exhibit a mix of agile and traditional software development methodologies. Many of these mixed methodologies are organization-specific and tailored to a specific project. Our objective in this research-in-progress paper is to develop an artifact that can guide the development of such a mixed methodology. Using control theory, we design a process model that provides theoretical guidance for building a portfolio of controls that can support the development of a mixed methodology for software development. Controls, embedded in methods, provide a generalizable and adaptable framework for project managers to develop a mixed methodology specific to the demands of their project. A research methodology is proposed to test the model. Finally, future directions and contributions are discussed.

Relevance:

20.00%

Abstract:

In a landmark book published in 2000, the sociologist Danièle Hervieu-Léger defined religion as a chain of memory, by which she meant that within religious communities remembered traditions are transmitted with an overpowering authority from generation to generation. After analysing Hervieu-Léger’s sociological approach as overcoming the dichotomy between substantive and functional definitions, this article compares a ritual honouring the ancestors in which a medium becomes possessed by the senior elder’s ancestor spirit among the Shona of Zimbabwe with a cleansing ritual performed by a Celtic shaman in New Hampshire, USA. In both instances, despite different social and historical contexts, appeals are made to an authoritative tradition to legitimize the rituals performed. This lends support to the claim that the authoritative transmission of a remembered tradition, by exercising an overwhelming power over communities, even if the memory of such a tradition is merely postulated, identifies the necessary and essential component for any activity to be labelled “religious”.

Relevance:

20.00%

Abstract:

New compensation methods are presented that can greatly reduce the slit errors (i.e. transition location errors) and interval errors induced by non-idealities in square-wave optical incremental encoders. An M/T-type, constant sample-time digital tachometer (CSDT) is selected for measuring the velocity of the sensor drives. Using this data, three encoder compensation techniques (two pseudoinverse-based methods and an iterative method) are presented that improve velocity measurement accuracy. The methods do not require precise knowledge of shaft velocity. During the initial learning stage of the compensation algorithm (possibly performed in-situ), slit errors/interval errors are calculated either through pseudoinverse-based solutions of simple approximate linear equations, which can provide fast solutions, or through an iterative method that requires very little memory storage. Subsequent operation of the motion system uses the adjusted slit positions for more accurate velocity calculation. In the theoretical analysis of the compensation of encoder errors, error sources such as random electrical noise and error in the estimated reference velocity are considered. Initially, the proposed learning compensation techniques are validated by implementing the algorithms in MATLAB, showing a 95% to 99% improvement in velocity measurement. However, the efficiency of the algorithm is observed to decrease with higher levels of non-repetitive random noise and/or errors in the reference velocity calculation. The performance improvement in velocity measurement is also demonstrated experimentally using motor-drive systems, each of which includes a field-programmable gate array (FPGA) for CSDT counting/timing purposes and a digital signal processor (DSP). Results from open-loop velocity measurement and closed-loop servo-control applications, on three optical incremental square-wave encoders and two motor drives, are compiled. While implementing these algorithms experimentally on different drives (with and without a flywheel) and on encoders of different resolutions, slit error reductions of 60% to 86% are obtained (typically approximately 80%).
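As a rough illustration of the pseudoinverse-based learning stage described above, the following Python/NumPy sketch estimates per-slit position errors from transition intervals recorded at an approximately known reference velocity. The encoder size, noise levels and error model are illustrative assumptions, not the thesis's experimental setup.

    # Learning-stage sketch: recover slit position errors by least squares.
    import numpy as np

    N = 32                        # encoder slits per revolution (toy value)
    revs = 50                     # revolutions recorded during learning
    rng = np.random.default_rng(0)

    # Unknown slit position errors (rad); only defined up to a constant offset.
    true_err = rng.normal(0.0, 1e-3, N)
    true_err -= true_err.mean()

    # Measured interval angles (reference velocity * interval time):
    #   meas[i] = 2*pi/N + err[(i+1) % N] - err[i % N] + measurement noise
    nominal = 2.0 * np.pi / N
    deltas = np.roll(true_err, -1) - true_err
    meas = np.tile(nominal + deltas, revs) + rng.normal(0.0, 2e-5, N * revs)

    # Stack the approximate linear equations  A @ err = meas - nominal,
    # where each row has +1 at slit (i+1) % N and -1 at slit i % N.
    rows = N * revs
    A = np.zeros((rows, N))
    i = np.arange(rows) % N
    A[np.arange(rows), (i + 1) % N] += 1.0
    A[np.arange(rows), i] -= 1.0

    # Minimum-norm least-squares solution (equivalent to applying pinv(A)).
    est, *_ = np.linalg.lstsq(A, meas - nominal, rcond=None)
    est -= est.mean()             # remove the unobservable constant offset

    print("worst slit error after learning:", np.abs(est - true_err).max())

In subsequent operation the adjusted slit positions (nominal plus the estimates) would replace the ideal ones in the velocity calculation; the iterative variant mentioned above refines the same estimates in place, trading solution speed for a much smaller memory footprint.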