32 results for Galilean covariant formalism
at Queensland University of Technology - ePrints Archive
Abstract:
This article argues that copyright law is not just a creature of statute but also a social and imaginative construct. It evaluates a number of critiques of legal formalism. Part 1 examines whether the positive rules and principles of copyright law are the product of historical contingency and political expediency. Part 2 considers the social operation of copyright law in terms of its material effects and cultural significance. Part 3 investigates the future of copyright law in light of the politics of globalisation and the impact of new information technologies.
Abstract:
This thesis proposes that contemporary printmaking, at its most significant, marks the present through reconstructing pasts and anticipating futures. It argues this through examples in the field, occurring in contexts beyond the Euramerican (Europe and North America). The arguments revolve around how the practice of a number of significant artists in Japan, Australia and Thailand has generated conceptual and formal innovations in printmaking that transcend local histories and conventions whilst, paradoxically, also building upon them and creating new meanings. The arguments do not portray the relations between contemporary and traditional art as necessarily antagonistic but rather as productively dialectical. Furthermore, the case studies demonstrate that, in the 1980s and 1990s particularly, the studio practice of these printmakers was informed by other visual arts disciplines and reflected postmodern concerns. Departures from convention witnessed in these countries within the Asia-Pacific region shifted the field of the print into a heterogeneous and hybrid realm. The practitioners concerned (especially in Thailand) produced work that was more readily equated with performance and installation art than with printmaking per se. In Japan, the incursion of photography interrupted the decorative cast of printmaking and delivered it from a straightforward, craft-based aesthetic. In Australia, fixed notions of national identity were challenged by print practitioners through deliberate cultural rapprochements and technical contradictions (speaking across old and new languages). However, time-honoured print methods were not jettisoned by any of the case study artists. Their re-alignment of the fundamental attributes of printmaking, in line with materialist formalism, is a core consideration of my arguments. The artists selected for in-depth analysis from these three countries are all innovators whose geographical circumstances and creative praxis drew on local traditions whilst absorbing international trends. In their radical revisionism, they acknowledged the specificity of history and place, conditions of contingency and forces of globalisation. The transformational nature of their work during the late twentieth century connects it to the postmodern ethos and to a broader artistic and cultural nexus than has hitherto been recognised in literature on the print. Emerging from former guild-based practices, they ambitiously conceived their work to be part of a continually evolving visual arts vocabulary. I argue in this thesis that artists from the Asia-Pacific region have historically broken with the hermetic and Euramerican focus that has generally characterised the field. Inadequate documentation and access to print activity outside the dominant centres of critical discourse imply that readings of postmodernism have been too limited in their scope of inquiry. Other locations offer complexities of artistic practice where re-alignments of customary boundaries are often the norm. By addressing innovative activity in Japan, Australia and Thailand, this thesis exposes the need for a more inclusive theoretical framework and wider global reach than currently exists for ‘printmaking’.
Abstract:
Camera calibration information is required in order for multiple camera networks to deliver more than the sum of many single camera systems. Methods exist for manually calibrating cameras with high accuracy. Manually calibrating networks with many cameras is, however, time consuming, expensive and impractical for networks that undergo frequent change. For this reason, automatic calibration techniques have been vigorously researched in recent years. Fully automatic calibration methods depend on the ability to automatically find point correspondences between overlapping views. In typical camera networks, cameras are placed far apart to maximise coverage. This is referred to as a wide baseline scenario. Finding sufficient correspondences for camera calibration in wide baseline scenarios presents a significant challenge. This thesis focuses on developing more effective and efficient techniques for finding correspondences in uncalibrated, wide baseline, multiple-camera scenarios. The project consists of two major areas of work. The first is the development of more effective and efficient view covariant local feature extractors. The second area involves finding methods to extract scene information using the information contained in a limited set of matched affine features. Several novel affine adaptation techniques for salient features have been developed. A method is presented for efficiently computing the discrete scale space primal sketch of local image features. A scale selection method was implemented that makes use of the primal sketch. The primal sketch-based scale selection method has several advantages over the existing methods. It allows greater freedom in how the scale space is sampled, enables more accurate scale selection, is more effective at combining different functions for spatial position and scale selection, and leads to greater computational efficiency. Existing affine adaptation methods make use of the second moment matrix to estimate the local affine shape of local image features. In this thesis, it is shown that the Hessian matrix can be used in a similar way to estimate local feature shape. The Hessian matrix is effective for estimating the shape of blob-like structures, but is less effective for corner structures. It is simpler to compute than the second moment matrix, leading to a significant reduction in computational cost. A wide baseline dense correspondence extraction system, called WiDense, is presented in this thesis. It allows the extraction of large numbers of additional accurate correspondences, given only a few initial putative correspondences. It consists of the following algorithms: an affine region alignment algorithm that ensures accurate alignment between matched features; a method for extracting more matches in the vicinity of a matched pair of affine features, using the alignment information contained in the match; and an algorithm for extracting large numbers of highly accurate point correspondences from an aligned pair of feature regions. Experiments show that the correspondences generated by the WiDense system improve the success rate of computing the epipolar geometry of very widely separated views. This new method is successful in many cases where the features produced by the best wide baseline matching algorithms are insufficient for computing the scene geometry.
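As a rough illustration of the correspondence-growing idea described above (not the WiDense implementation itself; the function name, parameter values and acceptance threshold are assumptions made for this example), the following Python/OpenCV sketch warps one image into the other's frame using an affine estimate taken from a matched feature pair and refines nearby correspondences with normalised cross-correlation:

import cv2
import numpy as np

def grow_correspondences(img1, img2, A, seed_pts, patch=15, search=25):
    # Illustrative sketch only: given an affine map A (2x3, float) estimated
    # from a matched feature pair, propose additional correspondences near
    # seed points by warping img1 into img2's frame and refining each
    # predicted position with normalised cross-correlation.
    # img1, img2: single-channel uint8 images.
    half = patch // 2
    matches = []
    warped = cv2.warpAffine(img1, A, (img2.shape[1], img2.shape[0]))
    for (x, y) in seed_pts:
        # Predicted location of (x, y) in img2 under the affine estimate.
        px, py = A @ np.array([x, y, 1.0])
        px, py = int(round(px)), int(round(py))
        if (py - half < 0 or py + half + 1 > img2.shape[0] or
                px - half < 0 or px + half + 1 > img2.shape[1]):
            continue
        template = warped[py - half:py + half + 1, px - half:px + half + 1]
        x0, y0 = max(px - search, 0), max(py - search, 0)
        window = img2[y0:py + search + 1, x0:px + search + 1]
        if window.shape[0] < patch or window.shape[1] < patch:
            continue
        ncc = cv2.matchTemplate(window, template, cv2.TM_CCOEFF_NORMED)
        _, score, _, loc = cv2.minMaxLoc(ncc)
        if score > 0.8:  # illustrative acceptance threshold
            matches.append(((x, y), (x0 + loc[0] + half, y0 + loc[1] + half)))
    return matches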
Abstract:
Measures and theories of information abound, but there are few formalised methods for treating the contextuality that can manifest in different information systems. Quantum theory provides one possible formalism for treating information in context. This paper introduces a quantum-like model of the human mental lexicon and presents a set of recent experimental data suggesting that concept combinations can indeed behave non-separably. There is some reason to believe that the human mental lexicon displays entanglement.
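To make the notion of non-separability concrete, the short sketch below (with made-up numbers, not the experimental data referred to above) checks whether a joint recall distribution over associates of a two-word combination factorises into the product of its marginals; a separable model would make the deviation zero:

import numpy as np

# Illustrative joint probabilities of recalling associates (w1, w2) when a
# two-word concept combination is cued. The numbers are invented for the
# example; the paper's experimental data are not reproduced here.
joint = np.array([[0.35, 0.05],
                  [0.05, 0.55]])

p1 = joint.sum(axis=1)        # marginal for the first concept's associate
p2 = joint.sum(axis=0)        # marginal for the second concept's associate
product = np.outer(p1, p2)    # what a separable (independent) model predicts

print("joint:\n", joint)
print("product of marginals:\n", product)
print("max deviation from separability:", np.abs(joint - product).max())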
Abstract:
Using sculpture and drawing as my primary methods of investigation, this research explores ways of shifting the emphasis of my creative visual arts practice from object to process whilst still maintaining a primacy of material outcomes. My motivation was to locate ways of developing a sustained practice shaped as much by new works as by a creative flow between works. I imagined a practice where a logic of structure within discrete forms and a logic of the broader practice might be developed as mutually informed processes. Using basic structural components of multiple wooden curves and linear modes of deployment – in both sculptures and drawings – I have identified both emergence theory and the image of rhizomic growth (Deleuze and Guattari, 1987) as theoretically integral to this imagining of a creative practice, both in terms of critiquing and developing works. Whilst I adopt a formalist approach for this exegesis, the emergence and rhizome models allow it to work as a critique of movement, of becoming and changing, rather than merely a formalism of static structure. In these models, therefore, I have identified a formal approach that can be applied not only to objects, but to practice over time. The thorough reading and application of these ontological models (emergence and rhizome) to visual arts practice, in terms of processes, objects and changes, is the primary contribution of this thesis. The works that form the major component of the research develop, reflect and embody these notions of movement and change.
Abstract:
Over the last three years, in our Early Algebra Thinking Project, we have been studying Years 3 to 5 students’ ability to generalise in a variety of situations, namely, compensation principles in computation, the balance principle in equivalence and equations, change and inverse change rules with function machines, and pattern rules with growing patterns. In these studies, we have attempted to involve a variety of models and representations and to build students’ abilities to switch between them (in line with the theories of Dreyfus, 1991, and Duval, 1999). The results have shown the negative effect of closure on generalisation in symbolic representations, the predominance of single variance generalisation over covariant generalisation in tabular representations, and the reduced ability to readily identify commonalities and relationships in enactive and iconic representations. This chapter uses the results to explore the interrelation between generalisation and verbal and visual comprehension of context. The studies evidence the importance of understanding and communicating aspects of representational forms which allowed commonalities to be seen across or between representations. Finally, the chapter explores the implications of the studies for a theory that describes a growth in integration of models and representations that leads to generalisation.
Abstract:
Robust, affine covariant feature extractors provide a means to extract correspondences between images captured by widely separated cameras. Advances in wide baseline correspondence extraction require looking beyond the robust feature extraction and matching approach. This study examines new techniques for extracting correspondences that take advantage of information contained in affine feature matches. Methods of improving the accuracy of a set of putative matches, eliminating incorrect matches and extracting large numbers of additional correspondences are explored. It is assumed that knowledge of the camera geometry is not available and not immediately recoverable. The new techniques are evaluated by means of an epipolar geometry estimation task. It is shown that these methods enable the computation of camera geometry in many cases where existing feature extractors cannot produce sufficient numbers of accurate correspondences.
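The evaluation task mentioned, estimating epipolar geometry from a set of putative matches, can be sketched as follows; this is a generic illustration using OpenCV's RANSAC-based fundamental matrix estimator rather than the specific pipeline of the study, and the threshold value is an assumption:

import cv2
import numpy as np

def epipolar_from_matches(pts1, pts2, threshold=1.0):
    # pts1, pts2: (N, 2) arrays of putative point correspondences between
    # two widely separated views. RANSAC rejects incorrect matches while
    # estimating the fundamental matrix F; success depends on having enough
    # accurate correspondences to begin with.
    pts1 = np.asarray(pts1, dtype=np.float64)
    pts2 = np.asarray(pts2, dtype=np.float64)
    F, inlier_mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC,
                                            threshold, 0.999)
    if F is None:
        return None, 0
    return F, int(inlier_mask.sum())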
Abstract:
Local image feature extractors that select local maxima of the determinant of Hessian function have been shown to perform well and are widely used. This paper introduces the negative local minima of the determinant of Hessian function for local feature extraction. The properties and scale-space behaviour of these features are examined and found to be desirable for feature extraction. It is shown how this new feature type can be implemented along with the existing local maxima approach at negligible extra processing cost. Applications to affine covariant feature extraction and sub-pixel precise corner extraction are demonstrated. Experimental results indicate that the new corner detector is more robust to image blur and noise than existing methods. It is also accurate for a broader range of corner geometries. An affine covariant feature extractor is implemented by combining the minima of the determinant of Hessian with existing scale and shape adaptation methods. This extractor can be implemented alongside the existing Hessian maxima extractor simply by finding both minima and maxima during the initial extraction stage. The minima features increase the number of correspondences two- to four-fold. The additional minima features are very distinct from the maxima features in descriptor space and do not make the matching process more ambiguous.
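A minimal single-scale sketch of extracting both feature types is given below; it illustrates the determinant-of-Hessian extrema idea only (scale selection, sub-pixel refinement and affine adaptation are omitted, and the threshold is an arbitrary placeholder):

import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter, minimum_filter

def det_hessian_extrema(image, sigma=2.0, threshold=1e-4):
    # Return (maxima, minima) pixel coordinates of the scale-normalised
    # determinant of Hessian response at a single scale sigma.
    image = image.astype(np.float64)
    # Second-order Gaussian derivatives (axis 0 = rows/y, axis 1 = cols/x).
    Lxx = gaussian_filter(image, sigma, order=(0, 2))
    Lyy = gaussian_filter(image, sigma, order=(2, 0))
    Lxy = gaussian_filter(image, sigma, order=(1, 1))
    # Scale-normalised determinant of the Hessian.
    det = (sigma ** 4) * (Lxx * Lyy - Lxy ** 2)
    # Positive local maxima: the usual blob-like features.
    maxima = (det == maximum_filter(det, size=3)) & (det > threshold)
    # Negative local minima: the additional feature type discussed above.
    minima = (det == minimum_filter(det, size=3)) & (det < -threshold)
    return np.argwhere(maxima), np.argwhere(minima)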
Abstract:
Affine covariant local image features are a powerful tool for many applications, including matching and calibrating wide baseline images. Local feature extractors that use a saliency map to locate features require adaptation processes in order to extract affine covariant features. The most effective extractors make use of the second moment matrix (SMM) to iteratively estimate the affine shape of local image regions. This paper shows that the Hessian matrix can be used to estimate local affine shape in a similar fashion to the SMM. The Hessian matrix requires significantly less computational effort than the SMM, allowing more efficient affine adaptation. Experimental results indicate that using the Hessian matrix in conjunction with a feature extractor that selects features in regions with high second order gradients delivers equivalent quality correspondences in less than 17% of the processing time, compared to the same extractor using the SMM.
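The following sketch shows one plausible way to turn the Hessian matrix at a point into a local shape estimate, in the same spirit as SMM-based affine adaptation; it is a single-step illustration under assumed details (absolute eigenvalues, unit-determinant normalisation), not the iterative procedure evaluated in the paper:

import numpy as np
from scipy.ndimage import gaussian_filter

def hessian_shape_matrix(image, x, y, sigma=2.0):
    # Single-step illustration: estimate a local affine shape matrix from
    # the Hessian at pixel (row y, column x). A full method would iterate,
    # re-measuring the Hessian in the normalised frame until it is close to
    # isotropic, analogously to second moment matrix adaptation.
    image = image.astype(np.float64)
    Lxx = gaussian_filter(image, sigma, order=(0, 2))[y, x]
    Lyy = gaussian_filter(image, sigma, order=(2, 0))[y, x]
    Lxy = gaussian_filter(image, sigma, order=(1, 1))[y, x]
    H = np.array([[Lxx, Lxy],
                  [Lxy, Lyy]])
    vals, vecs = np.linalg.eigh(H)
    # Absolute eigenvalues keep the construction usable for saddle-like
    # responses; a small floor avoids division by zero in flat regions.
    vals = np.maximum(np.abs(vals), 1e-12)
    inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T
    # Normalise to unit determinant so the matrix encodes anisotropy only.
    return inv_sqrt / np.sqrt(np.linalg.det(inv_sqrt))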
Abstract:
We present a formalism for the analysis of sensitivity of nuclear magnetic resonance pulse sequences to variations of pulse sequence parameters, such as radiofrequency pulses, gradient pulses or evolution delays. The formalism enables the calculation of compact, analytic expressions for the derivatives of the density matrix and the observed signal with respect to the parameters varied. The analysis is based on two constructs computed in the course of modified density-matrix simulations: the error interrogation operators and error commutators. The approach presented is consequently named the Error Commutator Formalism (ECF). It is used to evaluate the sensitivity of the density matrix to parameter variation based on the simulations carried out for the ideal parameters, obviating the need for finite-difference calculations of signal errors. The ECF analysis therefore carries a computational cost comparable to a single density-matrix or product-operator simulation. Its application is illustrated using a number of examples from basic NMR spectroscopy. We show that the strength of the ECF is its ability to provide analytic insights into the propagation of errors through pulse sequences and the behaviour of signal errors under phase cycling. Furthermore, the approach is algorithmic and easily amenable to implementation in the form of programming code. It is envisaged that it could be incorporated into standard NMR product-operator simulation packages.
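The core idea, that the sensitivity of the observed signal to a pulse-sequence parameter can be obtained analytically from a commutator computed during a single density-matrix simulation, can be illustrated on a toy single-spin example; this is a generic demonstration of the principle, not the ECF itself, and the operators and flip angle are chosen purely for illustration:

import numpy as np
from scipy.linalg import expm

# Spin-1/2 operators.
Ix = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
Iy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)
Iz = 0.5 * np.array([[1, 0], [0, -1]], dtype=complex)
detect = Ix + 1j * Iy  # detection operator

def signal(theta, rho0=Iz):
    # Observed signal after an x-pulse of flip angle theta.
    U = expm(-1j * theta * Ix)
    rho = U @ rho0 @ U.conj().T
    return np.trace(detect @ rho)

def signal_derivative(theta, rho0=Iz):
    # d(rho)/d(theta) = -i [Ix, rho], so the sensitivity of the signal to
    # the flip angle is a single trace involving a commutator; no
    # finite-difference perturbation of the simulation is needed.
    U = expm(-1j * theta * Ix)
    rho = U @ rho0 @ U.conj().T
    drho = -1j * (Ix @ rho - rho @ Ix)
    return np.trace(detect @ drho)

theta, eps = np.pi / 2, 1e-6
fd = (signal(theta + eps) - signal(theta - eps)) / (2 * eps)
print("analytic:", signal_derivative(theta))
print("finite difference:", fd)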
Abstract:
Proving security of cryptographic schemes, which normally are short algorithms, has been known to be time-consuming and easy to get wrong. Using computers to analyse their security can help to solve the problem. This thesis focuses on methods of using computers to verify security of such schemes in cryptographic models. The contributions of this thesis to automated security proofs of cryptographic schemes can be divided into two groups: indirect and direct techniques. Regarding indirect ones, we propose a technique to verify the security of public-key-based key exchange protocols. Security of such protocols can already be proved automatically using an existing tool, but only in a non-cryptographic model. We show that under some conditions, security in that non-cryptographic model implies security in a common cryptographic one, the Bellare-Rogaway model [11]. The implication enables one to use that existing tool, which was designed to work with a different type of model, in order to achieve security proofs of public-key-based key exchange protocols in a cryptographic model. For direct techniques, we have two contributions. The first is a tool to verify Diffie-Hellman-based key exchange protocols. In that work, we design a simple programming language for specifying Diffie-Hellman-based key exchange algorithms. The language has a semantics based on a cryptographic model, the Bellare-Rogaway model [11]. From the semantics, we build a Hoare-style logic which allows us to reason about the security of a key exchange algorithm, specified as a pair of initiator and responder programs. The other direct contribution concerns automated proofs of computational indistinguishability. Unlike the two other contributions, this one does not treat a fixed class of protocols. We construct a generic formalism which allows one to model the security problem of a variety of classes of cryptographic schemes as the indistinguishability between two pieces of information. We also design and implement an algorithm for solving indistinguishability problems. Compared to the two other works, this one covers significantly more types of schemes, but consequently, it can verify only weaker forms of security.
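For readers unfamiliar with the class of protocols targeted by the first direct technique, the toy sketch below shows an unauthenticated Diffie-Hellman exchange written as an initiator/responder pair, which is the general shape of specification such tools reason about; the tiny textbook group parameters are deliberately insecure and purely illustrative, and the sketch is not the thesis's specification language:

import secrets

# Textbook toy parameters (insecure, for illustration only).
P, G = 23, 5

def initiator_start():
    # Initiator: choose an ephemeral exponent x and send g^x.
    x = secrets.randbelow(P - 2) + 1
    return x, pow(G, x, P)

def responder(gx):
    # Responder: choose y, send g^y, derive the shared key (g^x)^y.
    y = secrets.randbelow(P - 2) + 1
    return pow(G, y, P), pow(gx, y, P)

def initiator_finish(x, gy):
    # Initiator: derive the shared key (g^y)^x.
    return pow(gy, x, P)

x, gx = initiator_start()
gy, key_r = responder(gx)
key_i = initiator_finish(x, gy)
assert key_i == key_r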
Abstract:
In this paper, general order conditions and a global convergence proof are given for stochastic Runge-Kutta methods applied to stochastic ordinary differential equations (SODEs) of Stratonovich type. This work generalizes the ideas of B-series, as applied to deterministic ordinary differential equations (ODEs), to the stochastic case and allows a completely general formalism for constructing high order stochastic methods, either explicit or implicit. Some numerical results are given to illustrate this theory.
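As a minimal point of reference (this is the simplest Stratonovich-consistent scheme, not one of the general high-order methods constructed in the paper), the stochastic Heun method applied to geometric Brownian motion looks as follows; the drift, diffusion and step count are arbitrary example choices:

import numpy as np

def stratonovich_heun(f, g, y0, t_end, n_steps, rng):
    # Stochastic Heun method for dy = f(y) dt + g(y) o dW (Stratonovich).
    # A predictor (Euler) step followed by a trapezoidal corrector, using
    # the same Wiener increment dW in both stages.
    h = t_end / n_steps
    y = np.empty(n_steps + 1)
    y[0] = y0
    for n in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(h))
        y_pred = y[n] + f(y[n]) * h + g(y[n]) * dW
        y[n + 1] = (y[n]
                    + 0.5 * (f(y[n]) + f(y_pred)) * h
                    + 0.5 * (g(y[n]) + g(y_pred)) * dW)
    return y

# Geometric Brownian motion: dy = mu*y dt + sigma*y o dW.
mu, sigma = 0.5, 0.3
rng = np.random.default_rng(0)
path = stratonovich_heun(lambda y: mu * y, lambda y: sigma * y,
                         y0=1.0, t_end=1.0, n_steps=1000, rng=rng)
print(path[-1])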
Abstract:
Biological systems exhibit a wide range of contextual effects, and this often makes it difficult to construct valid mathematical models of their behaviour. In particular, mathematical paradigms built upon the successes of Newtonian physics make assumptions about the nature of biological systems that are unlikely to hold true. After discussing two of the key assumptions underlying the Newtonian paradigm, we discuss two key aspects of quantum theory (QT), the formalism that extended it. We draw attention to the similarities between biological and quantum systems, motivating the development of a similar formalism that can be applied to the modelling of biological processes.
Abstract:
We present a tool for the automatic analysis of computational indistinguishability between two strings of information. It is designed as a generic tool for proving cryptographic security, based on a formalism that provides computational soundness preservation. The tool has been implemented and tested successfully with several cryptographic schemes.
Abstract:
Social tagging systems are shown to evidence a well-known cognitive heuristic, the guppy effect, which arises from the combination of different concepts. We present some empirical evidence of this effect, drawn from a popular social tagging Web service. The guppy effect is then described using a quantum-inspired formalism that has already been successfully applied to model the conjunction fallacy and probability judgement errors. Key to the formalism is the concept of interference, which is able to capture and quantify the strength of the guppy effect.
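A generic way to see how an interference term can produce guppy-like over- or underextension (this is the standard quantum-like parameterisation of interference, with illustrative numbers rather than values fitted to the tagging data) is sketched below:

import numpy as np

def combined_probability(p_a, p_b, phase):
    # Quantum-like probability that a tag applies to a concept combination
    # modelled as a superposition of the two concept states: the classical
    # average plus an interference term whose strength is set by the phase.
    # Not every (p_a, p_b, phase) triple is realisable in an actual
    # Hilbert-space model; this is purely a demonstration of the term.
    return 0.5 * (p_a + p_b) + np.sqrt(p_a * p_b) * np.cos(phase)

# Each concept alone receives the tag with modest probability, yet
# constructive interference (phase 0) doubles the probability for the
# combination, while destructive interference (phase pi) suppresses it.
print(combined_probability(0.25, 0.25, phase=0.0))    # 0.5
print(combined_probability(0.25, 0.25, phase=np.pi))  # 0.0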