18 results for Computational music theory
Abstract:
Transcript of a Panel Discussion at the Dartmouth Symposium, chaired by Eric Lyon.
Abstract:
A simple logic of conditional preferences is defined, comprising a language that allows the compact representation of certain kinds of conditional preference statements, a semantics, and a proof theory. CP-nets and TCP-nets can be mapped into this logic, and the semantics and proof theory generalise those of CP-nets and TCP-nets. The system can also express preferences of a lexicographic kind. The paper derives various sufficient conditions for a set of conditional preferences to be consistent, along with algorithmic techniques for checking such conditions and hence confirming consistency. These techniques can also be used for totally ordering outcomes in a way that is consistent with the set of preferences, and they are further developed to give an approach to the problem of constrained optimisation for conditional preferences.
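The lexicographic preference orderings mentioned in this abstract can be sketched in a few lines. The sketch below is illustrative only, not the paper's formalism; the variables, value orderings, and outcomes are invented for the example:

```python
def lex_prefers(outcome_a, outcome_b, variables, value_order):
    """Return True if outcome_a is lexicographically preferred to outcome_b.

    variables: variable names in decreasing order of importance.
    value_order: dict mapping each variable to its values, best first.
    """
    for var in variables:
        rank_a = value_order[var].index(outcome_a[var])
        rank_b = value_order[var].index(outcome_b[var])
        if rank_a != rank_b:
            return rank_a < rank_b  # lower rank means more preferred
    return False  # the outcomes agree on every variable


# Hypothetical example: main course dominates the wine choice.
variables = ["main_course", "wine"]
value_order = {"main_course": ["fish", "meat"], "wine": ["white", "red"]}
a = {"main_course": "fish", "wine": "red"}
b = {"main_course": "meat", "wine": "white"}
print(lex_prefers(a, b, variables, value_order))  # True: fish beats meat
```

Under such an ordering the most important variable always decides first, which is exactly what makes lexicographic preferences a useful special case for consistency checking.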
Abstract:
The analytic advantages of central concepts from linguistics and information theory, and the analogies demonstrated between them, are developed for understanding patterns of retrieval from full-text indexes to documents. The interaction between the syntagm and the paradigm in computational operations on written language (in indexing, searching, and retrieval) is used to account for transformations of the signified, or meaning, between documents and their representations and between queries and the documents retrieved. Characteristics of the message, and of messages for selection in written language, are brought to bear to explain the relative frequency of occurrence of words and multiple-word sequences in documents. The examples given in the companion article are revisited and a fuller example is introduced. The signified of the sequence "stood for" (the term classically used in definitions of the sign, as something standing for something else) can itself change rapidly according to its syntagm. An understanding of patterns in retrieval deeper than that of ordinary discourse is obtained.
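The relative frequencies of words and multiple-word sequences that this abstract appeals to are straightforward to compute. A minimal sketch, with a toy document invented for illustration:

```python
from collections import Counter


def ngram_frequencies(text, n):
    """Relative frequency of each n-word sequence (syntagm) in a text."""
    words = text.lower().split()
    grams = [" ".join(words[i:i + n]) for i in range(len(words) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return {gram: count / total for gram, count in counts.items()}


# Toy document echoing the abstract's own example sequence "stood for".
doc = "stood for stood for something standing for something else"
print(ngram_frequencies(doc, 2)["stood for"])  # 0.25 (2 of 8 bigrams)
```

Comparing such frequency tables across single words and longer sequences is one concrete way the syntagm/paradigm interaction described above shows up in full-text indexes.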
Abstract:
This paper discusses the calculation of electron impact collision strengths and effective collision strengths for iron peak elements of importance in the analysis of many astronomical and laboratory spectra. It commences with a brief overview of R-matrix theory, which is the basis of the computer programs that have been widely used to calculate the relevant atomic data used in this analysis. A summary is then given of calculations carried out over the last 20 years for electron collisions with Fe II. The grand challenge, represented by the calculation of accurate collision strengths and effective collision strengths for this ion, is then discussed. A new parallel R-matrix program, PRMAT, which is being developed to meet this challenge, is then described, and results of recent calculations using this program to determine optically forbidden transitions in e- – Ni IV on a Cray T3E-1200 parallel supercomputer are presented. The implications of this e- – Ni IV calculation for the determination of accurate data from an isoelectronic e- – Fe II calculation are discussed, and finally some future directions of research are reviewed.
Abstract:
The carbazole moiety is a component of many important pharmaceuticals including anticancer and anti-HIV agents and is commonly utilized in the production of modern polymeric materials with novel photophysical and electronic properties. Simple carbazoles are generally produced via the aromatization of the respective tetrahydrocarbazole (THCZ). In this work, density functional theory calculations are used to model the reaction pathway of tetrahydrocarbazole aromatization over Pd(111). The geometry of each of the intermediate surface species has been determined, and how each structure interacts with the metal surface has been addressed. The reaction energies and barriers of each of the elementary surface reactions have also been calculated, and a detailed analysis of the energetic trends has been performed. Our calculations have shown that the surface intermediates remain fixed to the surface via the aromatic ring in a manner similar to that of THCZ. Moreover, the aliphatic ring becomes progressively more planar with the dissociation of each subsequent hydrogen atom. Analysis of the reaction energy profile has revealed that the trend in reaction barriers is determined by two factors: (i) the strength of the dissociating ring-H bond and (ii) the subsequent gain in energy due to the geometric relaxation of the aliphatic ring. (c) 2008 American Institute of Physics.
Abstract:
"In this special issue's opening essay, Martin Dowling devotes almost half of "'Thought-Tormented Music': Joyce and the Music of the Irish Revival" to what he calls "the situation of music in the Irish literary revival." He focuses chiefly on 1904, which was both an intensely productive period for the revival movement and a year chock-full of crucial events and decisions for Joyce. Drawing on the works of Pierre Bourdieu and Jaques Lacan, Dowling explores the revivalists' efforts to "de-anglicize" Irish music, to remove foreign influences that distorted the "pure tradition of Irish song," and to achieve an improbable harmony between the music favoured by the disappearing Anglo-Irish aristocracy and the Irish-speaking peasantry. Inevitably, disputes occurred over what constituted "authentic" Irish music. Factions quarrelled over whether pristine Irish music existed in the Atlantic seaboard or more inland; whether "authentic" songs were sung with or without instrumental accompaniment; and whether the piano, rather than the traditional harp, was a legitimate instrument of accompaniment. Having delineated the historical and theoretical context, Dowling offers a richly detailed analysis of Joyce's story "A Mother." He reveals how almost every element in the story--from the Eire Abu Society to the Antient Concert Rooms, from the conflict between Mrs. Kearney and Hoppy Holohan to the plight of Kathleen Kearney--is charged with meaning by the subtextual conflicts of the revivalists' agenda. Dowling explains also the "authenticity" in Joyce's depiction of vocal performances of "The Lass of Aughrim" in "The Dead" and "The Croppy Boy" in "Sirens," which he calls two "true gems" of authentic Irish music." --Introduction by Charles Rossman and Alan W. Friedman, Guest Editors, pp. 409-410
Abstract:
We introduce a novel graph class we call universal hierarchical graphs (UHG), whose topology appears in numerous problems representing, e.g., temporal, spatial or general process structures of systems. For this graph class we show that we can naturally assign two probability distributions, one for nodes and one for edges, which lead directly to the definition of the entropy and joint entropy and, hence, mutual information, establishing an information theory for this graph class. Furthermore, we provide some results on the conditions under which these constrained probability distributions maximize the corresponding entropy. We also demonstrate that these entropic measures can be computed efficiently, which is a prerequisite for any large-scale practical application, and show some numerical examples. (c) 2007 Elsevier Inc. All rights reserved.
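The entropy of a node or edge probability distribution, as used in this abstract, is the standard Shannon entropy. A minimal sketch with an invented distribution (the actual UHG distributions are derived from the graph's hierarchy, which is not reproduced here):

```python
import math


def entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)


# Hypothetical node distribution over a small three-node hierarchy.
node_probs = [0.5, 0.25, 0.25]
print(entropy(node_probs))       # 1.5 bits
# A uniform distribution over the same support maximizes the entropy:
print(entropy([1 / 3] * 3))      # log2(3), about 1.585 bits
```

The maximization results mentioned above characterize when the constrained distributions attain this uniform-like upper bound.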
Abstract:
This paper describes the deployment on GPUs of PROP, a program of the 2DRMP suite which models electron collisions with H-like atoms and ions. Because performance on GPUs is better in single precision than in double precision, the numerical stability of the PROP program in single precision has been studied. The numerical quality of PROP results computed in single precision, and their impact on the next program of the 2DRMP suite, has been analyzed. Successive versions of the PROP program on GPUs have been developed in order to improve its performance. Particular attention has been paid to the optimization of data transfers and of linear algebra operations. Performance results obtained on several architectures (including NVIDIA Fermi) are presented.
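The single- versus double-precision trade-off studied here is easy to demonstrate: IEEE single precision carries roughly 7 decimal digits against roughly 16 for double. The sketch below (not from the paper) rounds a Python double to the nearest single-precision value via the standard `struct` module:

```python
import struct


def to_single(x):
    """Round a double-precision Python float to the nearest IEEE-754 single."""
    return struct.unpack("f", struct.pack("f", x))[0]


# A term of 1e-8 added to 1.0 survives in double precision
# but is absorbed entirely in single precision:
print(1.0 + 1e-8 == 1.0)             # False: double keeps the small term
print(to_single(1.0 + 1e-8) == 1.0)  # True: single loses it
```

Losses of exactly this kind are what a single-precision stability study of a program like PROP has to bound.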
Abstract:
In a 1999 essay, J.M. Balkin and Sanford Levinson called for law to be considered as a performing art. Against, or perhaps going further than, Balkin and Levinson, this commentary claims that while engagement with performance practices in the arts, such as music, is of the utmost value to law and legal theory, we must not take for granted what it means to "perform". Uniting Jacques Derrida's la Villette performance (with jazz legend Ornette Coleman) with his writings on performativity in law, this commentary looks to the musical practice of improvisation to trouble the notion of performance as immediate and singular and to question taken-for-granted distinctions between text and performance, writing and music, composition and improvisation. The consequence of this refined understanding of the performative on legal theory and the actual practice of law is a reconceptualization of law as improvisation, that is, both singular and general, pre-existent and immediate, and a refocusing on the creativity that lies at the heart of law's conservatism.
Abstract:
Various scientific studies have explored the causes of violent behaviour from different perspectives, with psychological tests, in particular, applied to the analysis of crime factors. Relationships between pairs of factors have also been extensively studied, including the link between age and crime. In reality, many factors interact to contribute to criminal behaviour, and as such there is a need for greater insight into its complex nature. In this article we analyse violent crime information systems containing data on psychological, environmental and genetic factors. Our approach combines elements of rough set theory with fuzzy logic and particle swarm optimisation to yield an algorithm and methodology that can effectively extract multi-knowledge from information systems. The experimental results show that our approach outperforms alternative genetic algorithm and dynamic reduct-based techniques for reduct identification and has the added advantage of identifying multiple reducts and hence multi-knowledge (rules). Identified rules are consistent with classical statistical analysis of violent crime data and also reveal new insights into the interaction between several factors. As such, the results are helpful in improving our understanding of the factors contributing to violent crime and in highlighting the existence of hidden and intangible relationships between crime factors.
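The rough-set notion of a reduct underlying this abstract (a minimal attribute subset that still determines the decision) can be sketched by brute force on a toy decision table. This is not the paper's fuzzy/particle-swarm method, just the plain definition; the crime-factor attributes and rows below are invented for illustration:

```python
from itertools import combinations


def determines(rows, attrs, decision):
    """True if no two rows agreeing on attrs disagree on the decision."""
    seen = {}
    for row in rows:
        key = tuple(row[a] for a in attrs)
        if key in seen and seen[key] != row[decision]:
            return False
        seen.setdefault(key, row[decision])
    return True


def all_reducts(rows, attrs, decision):
    """Every minimal attribute subset that still determines the decision."""
    reducts = []
    for size in range(1, len(attrs) + 1):
        for subset in combinations(attrs, size):
            if determines(rows, subset, decision) and not any(
                set(red) <= set(subset) for red in reducts
            ):
                reducts.append(subset)
    return reducts


# Toy decision table with hypothetical crime-factor attributes.
rows = [
    {"age": "young", "env": "poor", "psych": "high", "violent": "yes"},
    {"age": "young", "env": "good", "psych": "low", "violent": "no"},
    {"age": "old",   "env": "poor", "psych": "low", "violent": "no"},
    {"age": "old",   "env": "good", "psych": "high", "violent": "yes"},
]
print(all_reducts(rows, ["age", "env", "psych"], "violent"))
```

On this toy table both `("psych",)` and `("age", "env")` are reducts, which is the "multiple reducts and hence multi-knowledge" situation the abstract refers to; exhaustive search is exponential, which is why heuristics such as particle swarm optimisation are used in practice.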
Abstract:
Traditional internal combustion engine vehicles are a major contributor to global greenhouse gas emissions and other air pollutants, such as particulate matter and nitrogen oxides. If the tailpipe point emissions could be managed centrally without reducing the commercial and personal user functionalities, then one of the most attractive solutions for achieving a significant reduction of emissions in the transport sector would be the mass deployment of electric vehicles. Though electric vehicle sales are still hindered by battery performance, cost and a few other technological bottlenecks, focused commercialisation and support from government policies are encouraging large-scale electric vehicle adoption. The mass proliferation of plug-in electric vehicles is likely to bring a significant additional electric load onto the grid, creating a highly complex operational problem for power system operators. Electric vehicle batteries also have the ability to act as energy storage points on the distribution system. This double charge-and-storage impact of many uncontrollable small-kW loads (consumers will want maximum flexibility) on a distribution system that was not originally designed for such operations has the potential to be detrimental to grid balancing. Intelligent scheduling methods, if established correctly, could smoothly integrate electric vehicles onto the grid: they can help avoid cycling of large combustion plants and the use of expensive fossil-fuel peaking plant, match renewable generation to electric vehicle charging, and prevent overloading of the distribution system that would cause a reduction in power quality. In this paper, state-of-the-art scheduling methods for integrating plug-in electric vehicles are reviewed, examined and categorised based on their computational techniques. Various existing approaches, covering analytical scheduling, conventional optimisation methods (e.g. linear and non-linear mixed-integer programming and dynamic programming), game theory, and meta-heuristic algorithms including genetic algorithms and particle swarm optimisation, are comprehensively surveyed, offering a systematic reference for grid scheduling considering intelligent electric vehicle integration.
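One of the simplest analytical scheduling ideas surveyed in work like this is valley filling: placing EV charging in the hours of lowest base load. A minimal greedy sketch (the load profile, energy demand and charge rate below are invented numbers, and real schedulers must also respect network and battery constraints):

```python
def valley_fill_schedule(base_load, energy_needed, max_rate):
    """Greedy valley filling: assign charging to the lowest-load hours first.

    base_load: background demand per hour (e.g. MW).
    energy_needed: total EV energy to deliver (same units x hours).
    max_rate: maximum charging power in any one hour.
    """
    schedule = [0.0] * len(base_load)
    remaining = energy_needed
    # Visit hours from lowest to highest background load.
    for hour in sorted(range(len(base_load)), key=lambda h: base_load[h]):
        if remaining <= 0:
            break
        charge = min(max_rate, remaining)
        schedule[hour] = charge
        remaining -= charge
    return schedule


base_load = [50.0, 40.0, 30.0, 35.0, 60.0, 80.0]  # hypothetical hourly MW
print(valley_fill_schedule(base_load, energy_needed=7.0, max_rate=3.0))
# [0.0, 1.0, 3.0, 3.0, 0.0, 0.0]: charging lands in the overnight valley
```

The optimisation and game-theoretic methods in the survey refine this idea with prices, uncertainty, and per-feeder limits, but the valley-filling objective is the common starting point.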
Abstract:
The energetics of the low-temperature adsorption and decomposition of nitrous oxide, N(2)O, on flat and stepped platinum surfaces were calculated using density-functional theory (DFT). The results show that the preferred adsorption site for N(2)O is an atop site, bound upright via the terminal nitrogen. The molecule is only weakly chemisorbed to the platinum surface. The decomposition barriers on flat (111) surfaces and stepped (211) surfaces are similar. While the barrier for N(2)O dissociation is relatively small, the surface rapidly becomes poisoned by adsorbed oxygen. These findings are supported by experimental results of pulsed N(2)O decomposition with 5% Pt/SiO(2) and bismuth-modified Pt/C catalysts. At low temperature, decomposition occurs but self-poisoning by O(ads) prevents further decomposition. At higher temperatures some desorption of O(2) is observed, allowing continued catalytic activity. The study with bismuth-modified Pt/C catalysts showed that, although the activation barriers calculated for both terraces and steps were similar, the actual rate was different for the two surfaces. Steps were found experimentally to be more active than terraces, and this is attributed to differences in the preexponential term. (C) 2004 Elsevier Inc. All rights reserved.
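The closing observation, that equal barriers can still give different rates through the preexponential term, follows directly from the Arrhenius form k = A exp(-Ea / kB T). A sketch with purely illustrative numbers (the prefactors, barrier and temperature below are not from the paper):

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K


def arrhenius_rate(prefactor, barrier_ev, temp_k):
    """Arrhenius rate k = A * exp(-Ea / (kB * T))."""
    return prefactor * math.exp(-barrier_ev / (K_B * temp_k))


# Same hypothetical barrier on steps and terraces, but a 100x larger
# prefactor on steps: the rate ratio is set entirely by the prefactors.
terrace = arrhenius_rate(1e13, 0.8, 500)
step = arrhenius_rate(1e15, 0.8, 500)
print(step / terrace)  # 100: the exponential factors cancel
```

With identical barriers the exponential terms cancel exactly, so any experimentally observed activity difference must come from A, which is the argument the abstract makes for steps versus terraces.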
Abstract:
Recent renewed interest in computational writer identification has resulted in an increased number of publications. In relation to historical musicology, its application has so far been limited. One of the obstacles seems to be that the clarity of the scanned images available for computational analysis is often not sufficient. In this paper, the use of the Hinge feature is proposed to avoid segmentation and staff-line removal for effective feature extraction from low-quality scans. The use of an autoencoder in Hinge feature space is suggested as an alternative to staff-line removal by image processing, and their performance is compared. The experiment shows an accuracy of 87% on a dataset containing 84 writers' samples, and the superiority of our segmentation- and staff-line-removal-free approach. A practical analysis of Bach's autograph manuscript of the Well-Tempered Clavier II (Additional MS. 35021 in the British Library, London) is also presented, demonstrating the broad applicability of our approach.
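The Hinge feature referred to above is, in essence, a normalized joint histogram of the pairs of contour angles formed where two ink-contour segments meet. A minimal sketch of that histogram step (the angle pairs below are synthetic; real pipelines extract them from handwriting contours):

```python
import math


def hinge_histogram(angle_pairs, bins=12):
    """Normalized 2D histogram over (phi1, phi2) contour-angle pairs,
    the core of the Hinge writer-identification feature."""
    hist = [[0.0] * bins for _ in range(bins)]
    for phi1, phi2 in angle_pairs:
        i = int((phi1 % (2 * math.pi)) / (2 * math.pi) * bins) % bins
        j = int((phi2 % (2 * math.pi)) / (2 * math.pi) * bins) % bins
        hist[i][j] += 1.0
    total = len(angle_pairs)
    return [[count / total for count in row] for row in hist]


# Synthetic angle pairs standing in for points sampled along ink contours.
pairs = [(0.1, 0.2), (3.0, 1.0), (0.1, 0.2)]
hist = hinge_histogram(pairs)
print(sum(sum(row) for row in hist))  # sums to 1: a probability distribution
```

Because the feature is computed from local contour angles rather than segmented symbols, it sidesteps the segmentation and staff-line removal steps that degrade on low-quality scans, which is the approach's advantage claimed above.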