59 results for implementation and complexity theory
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
The treatment of the Random-Phase Approximation Hamiltonians, encountered in different frameworks such as time-dependent density functional theory or the Bethe-Salpeter equation, is complicated by their non-Hermiticity. Compared to their Hermitian counterparts, computational methods for the treatment of non-Hermitian Hamiltonians are often less efficient and less stable, sometimes leading to the breakdown of the method. Recently [Gruning et al., Nano Lett. 9 (2009) 2820], we identified that such Hamiltonians are usually pseudo-Hermitian. Exploiting this property, we have implemented an algorithm of the Lanczos type for Random-Phase Approximation Hamiltonians that has the same stability and computational cost as its Hermitian counterpart, and applied it to the study of the optical response of carbon nanotubes. We present here the related theoretical grounds and technical details, and study the performance of the algorithm for the calculation of the optical absorption of a molecule within the Bethe-Salpeter equation framework.
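To illustrate the idea, here is a minimal sketch of a Lanczos recurrence run in the metric inner product <u, v>_F = u†Fv, which behaves like the Hermitian algorithm when the Hamiltonian is pseudo-Hermitian (H† = FHF⁻¹). The function name, the numpy-based interface, and the assumption that the Krylov vectors keep a positive F-norm are illustrative choices, not taken from the paper.

```python
import numpy as np

def pseudo_hermitian_lanczos(H, F, v0, n_iter):
    # Lanczos three-term recurrence in the metric inner product
    # <u, v>_F = u^H F v, exploiting pseudo-Hermiticity H^H = F H F^{-1}.
    # Assumes the Krylov vectors have positive F-norm (a hypothetical
    # sketch; names and interface are not from the paper).
    alphas, betas = [], []
    v = v0 / np.sqrt((v0.conj() @ F @ v0).real)
    v_prev = np.zeros_like(v)
    beta = 0.0
    for _ in range(n_iter):
        w = H @ v
        alpha = (v.conj() @ F @ w).real          # diagonal coefficient
        w = w - alpha * v - beta * v_prev
        beta = np.sqrt((w.conj() @ F @ w).real)  # off-diagonal coefficient
        alphas.append(alpha)
        if beta < 1e-12:                         # invariant subspace reached
            break
        betas.append(beta)
        v_prev, v = v, w / beta
    return np.array(alphas), np.array(betas)
```

The tridiagonal matrix built from `alphas` and `betas` can then be diagonalised cheaply, exactly as in the standard Hermitian Lanczos method; this is what gives the non-Hermitian problem the cost profile of a Hermitian one.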
Abstract:
The emergence of Grid computing technology has opened up an unprecedented opportunity for biologists to share and access data, resources and tools in an integrated environment, leading to a greater chance of knowledge discovery. GeneGrid is a Grid computing framework that seamlessly integrates a myriad of heterogeneous resources spanning multiple administrative domains and locations. It provides scientists with an integrated environment for streamlined access to a number of bioinformatics programs and databases through a simple and intuitive interface. It acts as a virtual bioinformatics laboratory by allowing scientists to create, execute and manage workflows that represent bioinformatics experiments. A number of cooperating Grid services interact in an orchestrated manner to provide this functionality. This paper gives insight into the details of the architecture, components and implementation of GeneGrid.
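The abstract does not show GeneGrid's interfaces, but the workflow idea it describes can be sketched abstractly. Everything below (the Task and Workflow names, the execute callback) is a hypothetical illustration of a workflow represented as a dependency-ordered list of bioinformatics steps, not GeneGrid's actual service API.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    # One bioinformatics step, e.g. a sequence-similarity search.
    name: str
    program: str          # e.g. "blastp" -- illustrative only
    inputs: list = field(default_factory=list)  # names of upstream tasks

@dataclass
class Workflow:
    # Tasks listed in dependency order; GeneGrid's own representation
    # is service-based and not detailed in the abstract.
    tasks: list = field(default_factory=list)

    def run(self, execute):
        # `execute` dispatches one task (for instance to a Grid
        # service stub) and returns its result.
        results = {}
        for task in self.tasks:
            upstream = [results[name] for name in task.inputs]
            results[task.name] = execute(task, upstream)
        return results
```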
Abstract:
This paper explores the law of accidental mixtures of goods. It traces the development of the English rules on mixture from the seminal nineteenth-century case of Spence v Union Marine Insurance Co to the present day, and compares their responses to those given by Roman law, which has always been claimed as an influence on our jurisprudence in this area. It is argued that the different answers given by English and Roman law to essentially the same problems of title result from the differing bases of these legal systems. Roman a priori theory is contrasted with the more practical reasoning of the common law, and while both sets of rules are judged to be coherent on their own terms, it is suggested that the difference between them reflects a more general philosophical disagreement about the proper functioning of a legal system and the relative importance of theoretical and pragmatic considerations.
Abstract:
The analytic advantages of central concepts from linguistics and information theory, and of the analogies demonstrated between them, are developed for understanding patterns of retrieval from full-text indexes to documents. The interaction between the syntagm and the paradigm in computational operations on written language in indexing, searching, and retrieval is used to account for transformations of the signified, or meaning, between documents and their representations and between queries and the documents retrieved. Characteristics of the message, and of the messages for selection, for written language are brought to bear to explain the relative frequency of occurrence of words and multi-word sequences in documents. The examples given in the companion article are revisited and a fuller example is introduced. The signified of a sequence, what it stands for (in the classic definition of the sign as something standing for something else), can itself change rapidly according to its syntagm. An understanding of patterns in retrieval beyond that of ordinary discourse is obtained.
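As a toy illustration of the frequency patterns the article appeals to, the following sketch computes the relative frequency of single words and multi-word sequences in a text. It is a generic n-gram count under simple whitespace tokenisation, not a method taken from the article.

```python
from collections import Counter

def ngram_frequencies(text, n):
    # Relative frequency of each n-word sequence in the text.
    words = text.lower().split()
    grams = [tuple(words[i:i + n]) for i in range(len(words) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return {gram: count / total for gram, count in counts.items()}

# Single words (n=1) recur far more often, and more ambiguously,
# than longer sequences (n=2, 3, ...), whose meaning is steadier.
```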
Abstract:
An analogy is established between the syntagm and paradigm from Saussurean linguistics and the message and messages for selection from the information theory initiated by Claude Shannon. The analogy is pursued both as an end in itself and for its analytic value in understanding patterns of retrieval from full-text systems. The multivalency of individual words when isolated from their syntagm is contrasted with the relative stability of meaning of multi-word sequences, when searching ordinary written discourse. The syntagm is understood as the linear sequence of oral and written language. Saussure's understanding of the word, as a unit which compels recognition by the mind, is endorsed, although not regarded as final. The lesser multivalency of multi-word sequences is understood as the greater determination of signification by the extended syntagm. The paradigm is primarily understood as the network of associations a word acquires when considered apart from the syntagm. The restriction of information theory to expression or signals, and its focus on the combinatorial aspects of the message, is sustained. The message in the model of communication in information theory can include sequences of written language. Shannon's understanding of the written word, as a cohesive group of letters, with strong internal statistical influences, is added to the Saussurean conception. Sequences of more than one word are regarded as weakly correlated concatenations of cohesive units.
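The contrast drawn between the multivalency of an isolated word and the stability of a multi-word sequence can be given a Shannon-style reading: the entropy of the set of possible readings (the messages for selection) is higher for the isolated word. A minimal sketch, with made-up probabilities used purely for illustration:

```python
import math

def entropy(probabilities):
    # Shannon entropy, in bits, of a set of messages for selection.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A word isolated from its syntagm, with four equally likely senses:
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits of uncertainty
# The same word inside a multi-word sequence, one reading dominant:
print(entropy([0.9, 0.1]))                # ~0.47 bits
```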
Abstract:
This paper, chosen as a best paper from the 2005 SAMOS Workshop on Computer Systems, describes for the first time the major Abhainn project for automated system-level design of embedded signal processing systems. In particular, it describes four key novelties: novel algorithm modelling techniques for DSP systems, automated implementation realisation, algorithm transformation for system optimisation, and automated inter-processor communication. These are applied to two complex systems: a radar system and a sonar system. In both cases, technology that allows non-experts to automatically create low-overhead, high-performance embedded signal processing systems is exhibited.
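The abstract names algorithm modelling techniques for DSP systems without detailing them. A standard example of such a technique is the synchronous-dataflow balance equation, sketched below for a linear chain of actors; this is generic textbook material, not Abhainn's actual formalism, and all names are illustrative.

```python
from fractions import Fraction
from functools import reduce
from math import gcd

def chain_repetitions(rates):
    # Repetition counts for a linear dataflow chain A1 -> A2 -> ...,
    # where rates[i] = (tokens produced per firing of Ai,
    #                   tokens consumed per firing of Ai+1) on edge i.
    # Solves the balance equations p_i * q_i = c_i * q_{i+1} for the
    # smallest positive integers (generic SDF analysis, not the
    # paper's tool).
    q = [Fraction(1)]
    for produced, consumed in rates:
        q.append(q[-1] * Fraction(produced, consumed))
    common = reduce(lambda a, b: a * b // gcd(a, b),
                    (f.denominator for f in q))
    return [int(f * common) for f in q]

# A 2-tokens-out, 3-tokens-in edge balances when A1 fires 3 times
# for every 2 firings of A2:
print(chain_repetitions([(2, 3)]))  # [3, 2]
```

Checks of this kind let a design tool verify, before any code is generated, that a proposed multi-rate signal processing graph can execute with bounded memory.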