27 results for FORMALISM
Abstract:
This article argues that copyright law is not just a creature of statute but also a social and imaginative construct. It evaluates a number of critiques of legal formalism. Part 1 examines whether the positive rules and principles of copyright law are the product of historical contingency and political expediency. Part 2 considers the social operation of copyright law in terms of its material effects and cultural significance. Part 3 investigates the future of copyright law in light of the politics of globalisation and the impact of new information technologies.
Abstract:
This thesis proposes that contemporary printmaking, at its most significant, marks the present through reconstructing pasts and anticipating futures. It argues this through examples in the field occurring in contexts beyond the Euramerican (Europe and North America). The arguments revolve around how the practice of a number of significant artists in Japan, Australia and Thailand has generated conceptual and formal innovations in printmaking that transcend local histories and conventions whilst, paradoxically, also building upon them and creating new meanings. The arguments do not portray the relations between contemporary and traditional art as necessarily antagonistic but rather as productively dialectical. Furthermore, the case studies demonstrate that, in the 1980s and 1990s particularly, the studio practice of these printmakers was informed by other visual arts disciplines and reflected postmodern concerns. Departures from convention witnessed in these countries within the Asia-Pacific region shifted the field of the print into a heterogeneous and hybrid realm. The practitioners concerned (especially in Thailand) produced work that was more readily equated with performance and installation art than with printmaking per se. In Japan, the incursion of photography interrupted the decorative cast of printmaking and delivered it from a straightforward, craft-based aesthetic. In Australia, fixed notions of national identity were challenged by print practitioners through deliberate cultural rapprochements and technical contradictions (speaking across old and new languages). However, time-honoured print methods were not jettisoned by any of the case study artists. Their re-alignment of the fundamental attributes of printmaking, in line with materialist formalism, is a core consideration of my arguments.
The artists selected for in-depth analysis from these three countries are all innovators whose geographical circumstances and creative praxis drew on local traditions whilst absorbing international trends. In their radical revisionism, they acknowledged the specificity of history and place, conditions of contingency and forces of globalisation. The transformational nature of their work during the late twentieth century connects it to the postmodern ethos and to a broader artistic and cultural nexus than has hitherto been recognised in the literature on the print. Emerging from former guild-based practices, they ambitiously conceived their work to be part of a continually evolving visual arts vocabulary. I argue in this thesis that artists from the Asia-Pacific region have historically broken with the hermetic and Euramerican focus that has generally characterised the field. Inadequate documentation and access to print activity outside the dominant centres of critical discourse imply that readings of postmodernism have been too limited in their scope of inquiry. Other locations offer complexities of artistic practice where re-alignments of customary boundaries are often the norm. By addressing innovative activity in Japan, Australia and Thailand, this thesis exposes the need for a more inclusive theoretical framework and wider global reach than currently exist for ‘printmaking’.
Abstract:
Measures and theories of information abound, but there are few formalised methods for treating the contextuality that can manifest in different information systems. Quantum theory provides one possible formalism for treating information in context. This paper introduces a quantum-like model of the human mental lexicon and presents a set of recent experimental data suggesting that concept combinations can indeed behave non-separably. There is thus some reason to believe that the human mental lexicon displays entanglement.
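The non-separability claim can be made concrete with a minimal sketch (illustrative amplitudes, not the paper's experimental data): a joint state of two two-valued concepts is separable exactly when its 2x2 amplitude matrix has rank 1, which for a 2x2 matrix means a vanishing determinant.

```python
import math

# psi[i][j] is the amplitude of the combination (concept A = i, concept B = j).
# The joint state factorises into independent concept states iff the amplitude
# matrix has rank 1; for a 2x2 matrix that is equivalent to det(psi) == 0.
def is_separable(psi, tol=1e-12):
    det = psi[0][0] * psi[1][1] - psi[0][1] * psi[1][0]
    return abs(det) < tol

# A product state: factorises as (0.6|0> + 0.8|1>) tensor |0>.
product = [[0.6, 0.0], [0.8, 0.0]]
# A Bell-like state: cannot be factorised, i.e. "entangled".
bell = [[1 / math.sqrt(2), 0.0], [0.0, 1 / math.sqrt(2)]]

print(is_separable(product))  # True
print(is_separable(bell))     # False
```

The determinant shortcut only works in the 2x2 case; larger concept spaces would need a Schmidt-rank (SVD) test.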
Abstract:
Using sculpture and drawing as my primary methods of investigation, this research explores ways of shifting the emphasis of my creative visual arts practice from object to process whilst still maintaining a primacy of material outcomes. My motivation was to locate ways of developing a sustained practice shaped as much by new works as by a creative flow between works. I imagined a practice where a logic of structure within discrete forms and a logic of the broader practice might be developed as mutually informed processes. Using basic structural components of multiple wooden curves and linear modes of deployment – in both sculptures and drawings – I have identified both emergence theory and the image of rhizomic growth (Deleuze and Guattari, 1987) as theoretically integral to this imagining of a creative practice, both in terms of critiquing and developing works. Whilst I adopt a formalist approach for this exegesis, the emergence and rhizome models allow it to work as a critique of movement, of becoming and changing, rather than merely as a formalism of static structure. In these models, therefore, I have identified a formal approach that can be applied not only to objects, but to practice over time. The thorough reading and application of these ontological models (emergence and rhizome) to visual arts practice, in terms of processes, objects and changes, is the primary contribution of this thesis. The works that form the major component of the research develop, reflect and embody these notions of movement and change.
Abstract:
We present a formalism for the analysis of sensitivity of nuclear magnetic resonance pulse sequences to variations of pulse sequence parameters, such as radiofrequency pulses, gradient pulses or evolution delays. The formalism enables the calculation of compact, analytic expressions for the derivatives of the density matrix and the observed signal with respect to the parameters varied. The analysis is based on two constructs computed in the course of modified density-matrix simulations: the error interrogation operators and error commutators. The approach presented is consequently named the Error Commutator Formalism (ECF). It is used to evaluate the sensitivity of the density matrix to parameter variation based on the simulations carried out for the ideal parameters, obviating the need for finite-difference calculations of signal errors. The ECF analysis therefore carries a computational cost comparable to a single density-matrix or product-operator simulation. Its application is illustrated using a number of examples from basic NMR spectroscopy. We show that the strength of the ECF is its ability to provide analytic insights into the propagation of errors through pulse sequences and the behaviour of signal errors under phase cycling. Furthermore, the approach is algorithmic and easily amenable to implementation in the form of a programming code. It is envisaged that it could be incorporated into standard NMR product-operator simulation packages.
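The analytic-derivative idea at the heart of the ECF can be illustrated on a single spin-1/2 rotation (a sketch of the general principle only, not the authors' ECF implementation): for rho(theta) = U rho0 U† with U = exp(-i theta Sx), the derivative d(rho)/d(theta) = -i [Sx, rho(theta)] is a commutator, so it can be computed from the ideal-parameter simulation without finite differences.

```python
import math

# Minimal 2x2 complex matrix helpers (spin-1/2).
def mm(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def dag(A):
    return [[A[j][i].conjugate() for j in range(2)] for i in range(2)]

Sx = [[0.0, 0.5], [0.5, 0.0]]          # spin-1/2 x operator
rho0 = [[1.0, 0.0], [0.0, 0.0]]        # initial density matrix |a><a|

def U(theta):                          # exp(-i*theta*Sx), written in closed form
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return [[c, -1j * s], [-1j * s, c]]

def rho(theta):
    return mm(mm(U(theta), rho0), dag(U(theta)))

def drho_analytic(theta):
    # Commutator form of the derivative: -i [Sx, rho(theta)].
    r = rho(theta)
    return [[-1j * (mm(Sx, r)[i][j] - mm(r, Sx)[i][j]) for j in range(2)] for i in range(2)]

def drho_fd(theta, h=1e-6):
    # Central finite difference, for comparison only.
    rp, rm = rho(theta + h), rho(theta - h)
    return [[(rp[i][j] - rm[i][j]) / (2 * h) for j in range(2)] for i in range(2)]

err = max(abs(drho_analytic(0.7)[i][j] - drho_fd(0.7)[i][j])
          for i in range(2) for j in range(2))
print(err < 1e-6)  # the commutator matches the finite-difference estimate
```

The commutator route costs one extra matrix product per parameter, which is why the ECF analysis stays comparable to a single density-matrix simulation.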
Abstract:
Proving the security of cryptographic schemes, which are normally short algorithms, is known to be time-consuming and easy to get wrong. Using computers to analyse their security can help to solve the problem. This thesis focuses on methods of using computers to verify the security of such schemes in cryptographic models. The contributions of this thesis to automated security proofs of cryptographic schemes can be divided into two groups: indirect and direct techniques. Regarding indirect techniques, we propose a technique to verify the security of public-key-based key exchange protocols. The security of such protocols could already be proved automatically using an existing tool, but only in a non-cryptographic model. We show that, under some conditions, security in that non-cryptographic model implies security in a common cryptographic one, the Bellare-Rogaway model [11]. The implication enables one to use that existing tool, which was designed to work with a different type of model, to achieve security proofs of public-key-based key exchange protocols in a cryptographic model. For direct techniques, we have two contributions. The first is a tool to verify Diffie-Hellman-based key exchange protocols. In that work, we design a simple programming language for specifying Diffie-Hellman-based key exchange algorithms. The language has a semantics based on a cryptographic model, the Bellare-Rogaway model [11]. From the semantics, we build a Hoare-style logic which allows us to reason about the security of a key exchange algorithm, specified as a pair of initiator and responder programs. The other contribution in the direct line concerns automated proofs of computational indistinguishability. Unlike the two other contributions, this one does not treat a fixed class of protocols. We construct a generic formalism which allows one to model the security problem of a variety of classes of cryptographic schemes as the indistinguishability between two pieces of information.
We also design and implement an algorithm for solving indistinguishability problems. Compared to the two other works, this one covers significantly more types of schemes, but consequently, it can verify only weaker forms of security.
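For readers unfamiliar with the protocol class targeted by the first direct technique, here is a textbook Diffie-Hellman exchange written as an initiator/responder pair. The parameters are deliberately toy-sized for readability; real protocols use vetted large groups and add authentication, which is precisely what Bellare-Rogaway-style analysis reasons about.

```python
import secrets

# Toy public parameters (illustration only; never use such a small group).
p, g = 23, 5

# Initiator program: pick secret a, send A = g^a mod p.
a = secrets.randbelow(p - 2) + 1
A = pow(g, a, p)

# Responder program: pick secret b, send B = g^b mod p.
b = secrets.randbelow(p - 2) + 1
B = pow(g, b, p)

# Each side derives the session key from the other's public value.
shared_initiator = pow(B, a, p)   # (g^b)^a mod p
shared_responder = pow(A, b, p)   # (g^a)^b mod p

print(shared_initiator == shared_responder)  # True: both hold g^(ab) mod p
```

The correctness check above is trivial; the hard part automated by the thesis is proving that no efficient adversary observing A and B can distinguish the derived key from random.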
Abstract:
In this paper, general order conditions and a global convergence proof are given for stochastic Runge-Kutta methods applied to stochastic ordinary differential equations (SODEs) of Stratonovich type. This work generalizes the ideas of B-series as applied to deterministic ordinary differential equations (ODEs) to the stochastic case and allows a completely general formalism for constructing high-order stochastic methods, either explicit or implicit. Some numerical results are given to illustrate this theory.
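A simple member of the method class studied here is the Euler-Heun scheme, a Stratonovich-consistent stochastic Runge-Kutta method (shown as a sketch of the setting, not the paper's general B-series construction; the drift/diffusion functions and test equation are illustrative).

```python
import math
import random

def euler_heun(a, b, x0, T, n, rng):
    """Euler-Heun scheme for dX = a(X) dt + b(X) o dW (Stratonovich)."""
    dt = T / n
    x = x0
    for _ in range(n):
        dW = rng.gauss(0.0, math.sqrt(dt))
        x_pred = x + a(x) * dt + b(x) * dW                    # Euler predictor
        x = x + a(x) * dt + 0.5 * (b(x) + b(x_pred)) * dW     # Heun-averaged diffusion
    return x

# Sanity check: with zero noise the scheme reduces to a deterministic method
# for dx/dt = x, whose exact solution at T = 1 is e.
rng = random.Random(0)
x = euler_heun(lambda x: x, lambda x: 0.0, 1.0, 1.0, 1000, rng)
print(abs(x - math.e) < 0.01)  # True
```

The B-series machinery of the paper is what lets one verify, tree by tree, which strong order such a scheme actually attains.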
Abstract:
Biological systems exhibit a wide range of contextual effects, and this often makes it difficult to construct valid mathematical models of their behaviour. In particular, mathematical paradigms built upon the successes of Newtonian physics make assumptions about the nature of biological systems that are unlikely to hold true. After discussing two of the key assumptions underlying the Newtonian paradigm, we discuss two key aspects of Quantum Theory (QT), the formalism that extended it. We draw attention to the similarities between biological and quantum systems, motivating the development of a similar formalism that can be applied to the modelling of biological processes.
Abstract:
We present a tool for automatic analysis of computational indistinguishability between two strings of information. This is designed as a generic tool for proving cryptographic security based on a formalism that provides computational soundness preservation. The tool has been implemented and tested successfully with several cryptographic schemes.
Abstract:
Social tagging systems are shown to evidence a well-known cognitive heuristic, the guppy effect, which arises from the combination of different concepts. We present some empirical evidence of this effect, drawn from a popular social tagging Web service. The guppy effect is then described using a quantum-inspired formalism that has already been successfully applied to model the conjunction fallacy and probability judgement errors. Key to the formalism is the concept of interference, which is able to capture and quantify the strength of the guppy effect.
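One common form of such quantum-inspired models expresses the probability of a concept combination as an average of the component probabilities plus an interference term; the code below is a sketch of that cos-theta form with illustrative (not fitted) parameter values.

```python
import math

def quantum_conjunction(p_a, p_b, theta):
    """Combination probability with an interference term weighted by cos(theta)."""
    return (p_a + p_b) / 2 + math.sqrt(p_a * p_b) * math.cos(theta)

# Illustrative membership probabilities for the two component concepts.
p_a, p_b = 0.3, 0.4
classical_max = max(p_a, p_b)

# Constructive interference (theta < 90 degrees) can push the combination
# above both components -- the guppy-effect overextension.
p_and = quantum_conjunction(p_a, p_b, math.radians(50))
print(p_and > classical_max)  # True
```

With theta near 90 degrees the interference term vanishes and the model reduces to the plain average, which is how the theta parameter quantifies the strength of the effect.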
Abstract:
In recent years, several works have investigated a formal model for Information Retrieval (IR) based on the mathematical formalism underlying quantum theory. These works have mainly exploited geometric and logical-algebraic features of the quantum formalism, for example entanglement, superposition of states, collapse onto basis states, and lattice relationships. In this poster I present an analogy between a typical IR scenario and the double-slit experiment. This experiment exhibits interference between events in a quantum system, causing the Kolmogorovian law of total probability to fail. The analogy puts forward routes for the application of quantum probability theory in IR. However, several questions still need to be addressed; they will be the subject of my PhD research.
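The failure of the total-probability law mentioned above can be shown in a few lines (amplitude values are illustrative): the observed probability is the squared modulus of the summed slit amplitudes, which differs from the Kolmogorovian sum P1 + P2 by an interference term.

```python
import cmath

# Complex amplitudes through the two slits (slit 2 carries a phase shift).
psi1 = cmath.rect(0.6, 0.0)
psi2 = cmath.rect(0.6, 2.0)

p1, p2 = abs(psi1) ** 2, abs(psi2) ** 2
p_both = abs(psi1 + psi2) ** 2                      # what is actually observed
interference = 2 * (psi1 * psi2.conjugate()).real   # the extra quantum term

print(abs(p_both - (p1 + p2 + interference)) < 1e-9)  # True: identity holds
print(p_both != p1 + p2)                              # True: total probability fails
```

In the IR analogy, the two "slits" stand for alternative events (e.g. relevance paths), and the interference term is what a classical probabilistic retrieval model has no way to express.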
Abstract:
Complex numbers are a fundamental aspect of the mathematical formalism of quantum physics. Quantum-like models developed outside physics have often overlooked the role of complex numbers; in particular, previous models in Information Retrieval (IR) ignored them. We argue that to advance the use of quantum models in IR, one has to lift the constraint of real-valued representations of the information space and package more information within the representation by means of complex numbers. As a first attempt, we propose a complex-valued representation for IR that explicitly uses complex-valued Hilbert spaces, in which terms, documents and queries are represented as complex-valued vectors. The proposal consists of integrating distributional semantics evidence within the real component of a term vector, whereas ontological information is encoded in the imaginary component. Our proposal has the merit of lifting the role of complex numbers from a computational byproduct of the model to the very mathematical texture that unifies different levels of semantic information. An empirical instantiation of our proposal is tested on the TREC Medical Record task of retrieving cohorts for clinical studies.
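The core encoding can be sketched as follows (the weights and the use of the Hermitian inner product's magnitude as a score are hypothetical choices for illustration, not the paper's exact instantiation): distributional evidence fills the real part of each term component, ontological evidence the imaginary part.

```python
def term_vector(distributional, ontological):
    """Pack two evidence streams into one complex-valued vector."""
    return [complex(d, o) for d, o in zip(distributional, ontological)]

def similarity(u, v):
    """Magnitude of the Hermitian inner product, used here as a retrieval score."""
    return abs(sum(a * b.conjugate() for a, b in zip(u, v)))

# Hypothetical three-term vocabulary with separate evidence weights.
doc = term_vector([0.5, 0.1, 0.0], [0.2, 0.0, 0.7])
query = term_vector([0.4, 0.0, 0.1], [0.3, 0.0, 0.5])

score = similarity(doc, query)
print(score > 0)  # the two evidence levels contribute to a single score
```

Note that a purely real model would have to keep the two evidence streams in separate vectors and combine their scores externally; the complex representation fuses them inside the inner product itself.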
Abstract:
Formal representations of business processes are used for the analysis of process behavior. Workflow nets are a widely used formalism for describing the behavior of business processes. The structure theory of processes investigates the relation between the structure of a model and its behavior. In this paper, we propose to employ the connectivity property of workflow nets as an angle for their structural analysis. In particular, we show how soundness verification can be organized using the biconnected components of a workflow net. This allows for efficient identification and localization of flaws in the behavior of workflow nets and for supporting process analysts with diagnostic information.
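The decomposition idea rests on articulation points: vertices whose removal disconnects the (undirected view of the) net graph, and which therefore delimit its biconnected components. The sketch below finds them with a standard DFS low-link algorithm on a small illustrative graph; it is not the paper's full soundness procedure, only the connectivity step it builds on.

```python
def articulation_points(adj):
    """Articulation points of an undirected graph with nodes 0..n-1 (Tarjan-style DFS)."""
    n = len(adj)
    disc, low = [0] * n, [0] * n
    visited, points = [False] * n, set()
    timer = [1]

    def dfs(u, parent):
        visited[u] = True
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        children = 0
        for v in adj[u]:
            if v == parent:
                continue
            if visited[v]:
                low[u] = min(low[u], disc[v])     # back edge
            else:
                children += 1
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if parent != -1 and low[v] >= disc[u]:
                    points.add(u)                 # u separates v's subtree
        if parent == -1 and children > 1:
            points.add(u)                         # root with >1 DFS children

    for s in range(n):
        if not visited[s]:
            dfs(s, -1)
    return points

# Two triangles sharing node 2: node 2 is the articulation point separating
# the two biconnected components, so each can be checked in isolation.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3, 4], 3: [2, 4], 4: [2, 3]}
print(articulation_points(adj))  # {2}
```

Checking each biconnected component separately is what makes flaw localization efficient: a behavioral problem can be pinned to one component rather than the whole net.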
Abstract:
Process models are usually depicted as directed graphs, with nodes representing activities and directed edges representing control flow. While structured processes with pre-defined control flow have been studied in detail, flexible processes including ad-hoc activities need further investigation. This paper presents the flexible process graph, a novel approach to modeling processes in the context of a dynamic environment and adaptive process participants' behavior. The approach allows defining execution constraints that are more restrictive than traditional ad-hoc processes and less restrictive than traditional control flow, thereby balancing structured control flow with unstructured ad-hoc activities. A flexible process graph focuses on what can be done to perform a process. Process participants' routing decisions are based on the current process state. As a formal grounding, the approach uses hypergraphs, in which each edge can associate any number of nodes. Hypergraphs are used to define the execution semantics of processes formally. We provide a process scenario to motivate and illustrate the approach.
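A hypergraph differs from an ordinary graph in that one edge may connect any number of nodes, which is what lets an activity depend on an arbitrary set of state elements at once. The sketch below is a deliberately simplified enabling rule (activity names, state elements, and the subset-based semantics are all hypothetical illustrations, not the paper's execution semantics).

```python
# Each hyperedge associates one activity with the set of state elements it
# requires; unlike a binary edge, it may touch any number of nodes.
hyperedges = {
    "prepare": {"order"},
    "check":   {"order", "stock"},
    "ship":    {"order", "stock", "invoice"},
}

def enabled(activity, state):
    """An activity is enabled when all state elements it requires are present."""
    return hyperedges[activity] <= state

# Routing decision from the current process state: participants may choose
# any enabled activity rather than follow a pre-defined control flow.
state = {"order", "stock"}
print([a for a in hyperedges if enabled(a, state)])  # ['prepare', 'check']
```

This captures the balance described in the abstract: the hyperedges constrain what can happen in a given state without prescribing a fixed ordering.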
Abstract:
The motion of marine vessels has traditionally been studied using two different approaches: manoeuvring and seakeeping. These two approaches use different reference frames and coordinate systems to describe the motion. This paper derives the kinematic models that characterize the transformation of motion variables (position, velocity, acceleration) and forces between the different coordinate systems used in these theories. The derivations presented here are done in terms of the formalism adopted in robotics. The advantage of this formulation is the use of matrix notation and operations. As an application, the transformation of the linear equations of motion used in seakeeping into body-fixed coordinates is considered for both zero and forward speed.
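The matrix style of the robotics formalism can be illustrated with the simplest such transformation: rotating body-fixed planar velocities (surge, sway, yaw rate) into inertial coordinates through the yaw angle. This is a 3-DOF sketch of the general idea, not the paper's full 6-DOF derivation.

```python
import math

def R_yaw(psi):
    """Rotation from body-fixed to inertial coordinates about the z-axis (yaw psi)."""
    c, s = math.cos(psi), math.sin(psi)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def transform(R, v):
    """Matrix-vector product: the single operation the formalism reduces everything to."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

# Body-fixed velocities: surge u = 2 m/s, sway v = 0, yaw rate r = 0.1 rad/s.
nu = [2.0, 0.0, 0.1]

# With the vessel heading 90 degrees from the inertial x-axis, pure surge
# becomes pure inertial y-velocity; the yaw rate is unaffected.
eta_dot = transform(R_yaw(math.pi / 2), nu)
print([round(x, 6) for x in eta_dot])  # [0.0, 2.0, 0.1]
```

Stacking such rotation blocks into larger transformation matrices is exactly what makes the matrix notation advantageous when positions, velocities, accelerations and forces must all be moved between frames consistently.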