882 results for automatic music analysis
Abstract:
It is convenient and effective to solve nonlinear problems with a model that has a linear-in-the-parameters (LITP) structure. However, the nonlinear parameters (e.g. the width of a Gaussian function) of each model term need to be predetermined, either from expert experience or through an exhaustive search. An alternative approach is to optimize them with a gradient-based technique (e.g. Newton's method). Unfortunately, all of these methods still require substantial computation. Recently, the extreme learning machine (ELM) has shown its advantages in terms of fast learning from data, but the sparsity of the constructed model cannot be guaranteed. This paper proposes a novel algorithm for the automatic construction of a nonlinear system model based on the extreme learning machine. This is achieved by effectively integrating the ELM and leave-one-out (LOO) cross-validation with our two-stage stepwise construction procedure [1]. The main objective is to improve the compactness and generalization capability of the model constructed by the ELM method. Numerical analysis shows that the proposed algorithm involves only about half the computation of the orthogonal least squares (OLS) based method. Simulation examples are included to confirm the efficacy and superiority of the proposed technique.
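To make the model-selection step concrete, here is a minimal sketch, not the authors' implementation: for any linear-in-the-parameters model, the leave-one-out residuals follow in closed form from the hat matrix, so candidate ELM sizes can be scored without refitting on each left-out point. The sigmoid activation, toy data and candidate sizes are illustrative assumptions.

```python
# Minimal ELM + closed-form LOO (PRESS) sketch; illustrative, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

def elm_hidden(X, W, b):
    """Random sigmoid hidden layer: the only nonlinear part of the model."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def loo_mse(H, y):
    """Closed-form leave-one-out MSE for the linear fit y ~ H beta."""
    P = H @ np.linalg.pinv(H)                 # hat (projection) matrix
    resid = y - P @ y
    leverage = np.clip(np.diag(P), 0.0, 1.0 - 1e-9)
    return float(np.mean((resid / (1.0 - leverage)) ** 2))

# Toy problem: y = sinc(x) + noise
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sinc(X[:, 0]) + 0.05 * rng.standard_normal(200)

# Score candidate model sizes by LOO error and keep the best
best = None
for n_hidden in (2, 4, 8, 16, 32, 64):
    W = rng.standard_normal((1, n_hidden))
    b = rng.standard_normal(n_hidden)
    score = loo_mse(elm_hidden(X, W, b), y)
    if best is None or score < best[0]:
        best = (score, n_hidden)
print("best LOO MSE %.4f with %d hidden nodes" % best)
```

The PRESS shortcut is what makes LOO cheap enough to drive a stepwise construction loop: each candidate term changes H by one column, and the score can be re-evaluated without retraining the nonlinear parameters.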
Abstract:
This article has arisen from a research-led production of Translations by Brian Friel for Queen’s University’s Tyrone Guthrie Society in February 2010. Drawing partly on a review of the existing critical literature and also from questions left unresolved by a previous experience of directing the play, the production sought to address through ‘active analysis’ (Merlin 2001) a number of research questions relating to the embodied nature of the rehearsal process and the historicity of Friel’s play. The analysis invokes Bergson (1910), Lefebvre (1991) and Worthen (2006) in establishing a performative correlative for insightful but more literary studies by Connolly (1993), Lojek (1994) and McGrath (1989 & 1999). A detailed account of the rehearsal process helps reveal the extent to which the idea of failure of communication is embedded in the text and embodied in performance, while an experiment with the partial use of the Irish language casts further light on Friel’s extraordinary device of rendering two languages through the medium of one. The use of music to counterpoint, rather than underscore, the action, together with an achronological sequence of projected historical images inspired by Andrews (1983), provided me, as director, with a means to challenge the audience’s presuppositions about the play. The sense of palimpsest, of layered histories, that this evoked also served to highlight Friel’s use of the wider stylistic palette of Anglo-Irish drama, revealing Translations as a forerunner of Stewart Parker’s more explicit formal experiments in Northern Star. In rehearsal and performance Friel’s place in the continuum of the Irish theatrical canon became clear, as stylistic allusions to O’Casey, Shaw, Wilde and Beckett were embodied by the actors on the rehearsal room floor.
Abstract:
Raman spectroscopy with far-red excitation has been used to study seized, tableted samples of MDMA (N-methyl-3,4-methylenedioxyamphetamine) and related compounds (MDA, MDEA, MBDB, 2C-B and amphetamine sulfate), as well as pure standards of these drugs. We have found that by using far-red (785 nm) excitation the level of fluorescence background even in untreated seized samples is sufficiently low that there is little difficulty in obtaining good quality data with moderate 2 min data accumulation times. The spectra can be used to distinguish between even chemically-similar substances, such as the geometrical isomers MDEA and MBDB, and between different polymorphic/hydrated forms of the same drug. Moreover, these differences can be found even in directly recorded spectra of seized samples which have been bulked with other materials, giving a rapid and non-destructive method for drug identification. The spectra can be processed to give unambiguous identification of both drug and excipients (even when more than one compound has been used as the bulking agent) and the relative intensities of drug and excipient bands can be used for quantitative or at least semi-quantitative analysis. Finally, the simple nature of the measurements lends itself to automatic sample handling so that sample throughputs of 20 samples per hour can be achieved with no real difficulty.
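As a rough illustration of the band-ratio idea, the following sketch (ours, not the paper's procedure) estimates a drug-to-excipient intensity ratio from baseline-corrected peak heights. The band positions and the synthetic two-Gaussian spectrum are placeholders, not reference values for MDMA or any excipient.

```python
# Semi-quantitative band-ratio sketch on a synthetic Raman spectrum.
import numpy as np

def band_height(wavenumber, intensity, centre, half_width=10.0):
    """Peak height in a window, after subtracting a linear baseline
    drawn through the window edges."""
    m = (wavenumber >= centre - half_width) & (wavenumber <= centre + half_width)
    x, yw = wavenumber[m], intensity[m]
    baseline = np.interp(x, [x[0], x[-1]], [yw[0], yw[-1]])
    return float(np.max(yw - baseline))

# Synthetic spectrum: one "drug" band and one "excipient" band (placeholders)
wn = np.linspace(200, 1800, 1601)
spec = np.exp(-((wn - 810) / 6) ** 2) + 0.6 * np.exp(-((wn - 1460) / 8) ** 2)

ratio = band_height(wn, spec, 810) / band_height(wn, spec, 1460)
print(f"drug/excipient band ratio: {ratio:.2f}")
```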
Abstract:
Speeding up sequential programs on multicores is a challenging problem that is in urgent need of a solution. Automatic parallelization of irregular, pointer-intensive codes, exemplified by the SPECint codes, is a very hard problem. This paper shows that, with a helping hand, such auto-parallelization is possible and fruitful. This paper makes the following contributions: (i) A compiler framework for extracting pipeline-like parallelism from outer program loops is presented. (ii) Using a light-weight programming model based on annotations, the programmer helps the compiler to find thread-level parallelism. Each of the annotations specifies only a small piece of semantic information that compiler analysis misses, e.g. stating that a variable is dead at a certain program point. The annotations are designed such that correctness is easily verified. Furthermore, we present a tool for suggesting annotations to the programmer. (iii) The methodology is applied to auto-parallelize several SPECint benchmarks. For the benchmark with the most parallelism (hmmer), we obtain a scalable 7-fold speedup on an AMD quad-core dual processor. The annotations constitute a parallel programming model that relies extensively on a sequential program representation. As a result, the complexity of debugging is not increased, and the annotations do not obscure the source code. These properties could prove valuable in increasing the efficiency of parallel programming.
Abstract:
In this paper, a novel approach to automatically sub-divide a complex geometry and apply an efficient mesh is presented. Following the identification and removal of thin-sheet regions from an arbitrary solid using the thick/thin decomposition approach developed by Robinson et al. [1], the technique here employs shape metrics generated using local sizing measures to identify long-slender regions within the thick body. A series of algorithms automatically partition the thick region into a non-manifold assembly of long-slender and complex sub-regions. A structured anisotropic mesh is applied to the thin-sheet and long-slender bodies, and the remaining complex bodies are filled with unstructured isotropic tetrahedra. The resulting semi-structured mesh possesses significantly fewer degrees of freedom than the equivalent unstructured mesh, demonstrating the effectiveness of the approach. The accuracy of the efficient meshes generated for a complex geometry is verified via a study that compares the results of a modal analysis with the results of an equivalent analysis on a dense tetrahedral mesh.
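As a hedged sketch of one possible shape metric of the kind described (not the authors' algorithm), a sub-region can be flagged as long-slender when one principal extent of points sampled from it dominates the others; the ratio threshold below is an arbitrary assumption.

```python
# Toy long-slender test via principal extents of a sampled point cloud.
import numpy as np

def is_long_slender(points, ratio=5.0):
    """True if the longest principal extent dominates the second one."""
    centred = points - points.mean(axis=0)
    s = np.linalg.svd(centred, compute_uv=False)  # extents along principal axes
    return bool(s[0] > ratio * s[1])

rng = np.random.default_rng(1)
beam = rng.uniform([-50, -1, -1], [50, 1, 1], size=(1000, 3))  # slender body
cube = rng.uniform(-1, 1, size=(1000, 3))                      # compact body
print(is_long_slender(beam), is_long_slender(cube))            # True False
```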
Abstract:
The effect of restructuring the form of three unfamiliar pop/rock songs was investigated in two experiments. In the first experiment, listeners' judgements of the likely location of sections of novel popular songs were explored by requiring participants to place the eight sections (Intro - Verse 1 - Chorus 1 - Verse 2 - Chorus 2 - Bridge (solo) - Chorus 3 - Extro) of the songs into the locations where they thought them most likely to occur within the song. Results revealed that participants were able to place the sections in approximately the right location with some accuracy, though they were unable to differentiate between choruses. In Experiment 2, each song was presented in three different structures: intact (original form), medium restructured (the sections in a moderately changed order), and highly restructured (more severe restructuring). The results show that listeners' judgements of predictability and liking were largely uninfluenced by the restructuring of the songs, in line with findings for classical music. Moment-by-moment liking judgements of the songs changed with repeated exposure, though the trend was downwards rather than upwards. Detailed analysis of moment-by-moment judgements at the ends and beginnings of sections showed that listeners were able to respond quickly to intact songs, but not to restructured songs. The results suggest that concatenationism prevails in listening to popular song at the expense of attention to larger structural features.
Abstract:
Music has always been used as an important dramaturgical strategy in Western theatre to create a holistic theatrical experience. In Shakespeare’s plays, music was employed as a unique dramaturgical device for various purposes. Twelfth Night distinguishes itself from among the many plays that employ music because it begins, ends and progresses with music. Music pervades Twelfth Night and is tightly interwoven with the play’s thematic concerns, such as love and gender. Because of music’s elusive nature and the difficulty of discussing musical aesthetics, Shakespearean music critics have approached music in the play as a theme or an idea. This paper seeks to build on that older scholarship by introducing an alternative framework that considers music’s musicality through a musicological analysis of the songs in Twelfth Night. In so doing, the paper hopes to show how and why music can modulate our responses to the play and, in particular, to the theme of gender, a problematic issue that produces the elusive and darker nature of this festive comedy.
Abstract:
Composite damage modelling with cohesive elements was initially limited to the analysis of interface damage or delamination. However, their use is now being extended to the analysis of in-plane tensile failure arising from matrix or fibre fracture. These interface elements are typically placed at locations where failure is likely to occur, which implies a certain a priori knowledge of the crack propagation path(s). In the case of a crack jump, for example, the location of the jump is usually not obvious, and the simulation would require the placement of cohesive elements at all element faces. A better option, presented here, is to determine the potential location of cohesive elements and insert them during the analysis. The aim of this work is to enable the crack path to be determined as part of the solution process. A subroutine has been developed and implemented in the commercial finite element package ABAQUS/Standard [1] to automatically insert cohesive elements within a pristine model, on the basis of the analysis of the current stress field. Results for the prediction of delamination are presented in this paper.
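The insertion criterion can be pictured with a small sketch; this is an illustration under our own assumptions, not the paper's ABAQUS subroutine. It flags shared element faces whose normal traction approaches a cohesive strength, marking them for zero-thickness cohesive element insertion; the activation factor and toy stress states are invented for the example.

```python
# Criterion-based flagging of faces for cohesive element insertion (sketch).
import numpy as np

def normal_traction(stress, normal):
    """Normal traction t = n . sigma . n on a face with unit normal n."""
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)
    return float(n @ np.asarray(stress) @ n)

def faces_to_split(face_stresses, face_normals, strength, activation=0.9):
    """Indices of faces whose normal traction is within `activation` of the
    cohesive strength: candidates for zero-thickness element insertion."""
    t = np.array([normal_traction(s, n)
                  for s, n in zip(face_stresses, face_normals)])
    return np.nonzero(t >= activation * strength)[0]

# Toy data: two faces, the second loaded close to the cohesive strength (MPa)
sigma_low = np.diag([10.0, 5.0, 2.0])
sigma_high = np.diag([58.0, 5.0, 2.0])
normals = [(1, 0, 0), (1, 0, 0)]
print(faces_to_split([sigma_low, sigma_high], normals, strength=60.0))  # [1]
```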
Abstract:
The creation of idealised, dimensionally reduced meshes for preliminary design and optimisation remains a time-consuming, manual task. A dimensionally reduced model is ideal for assessing design changes through modification of element properties without the need to create a new geometry or mesh. In this paper, a novel approach for automating the creation of mixed dimensional meshes is presented. The input to the process is a solid model which has been decomposed into a non-manifold assembly of smaller volumes with different meshing significance. Associativity between the original solid model and the dimensionally reduced equivalent is maintained. The approach is validated by means of a free-free modal analysis on an output mesh of a gas turbine engine component of industrial complexity. Extensions and enhancements to this work are also discussed.
Abstract:
In this paper we give a first account of a simple analysis tool for modeling temporal compression for the automatic mitigation of multipath-induced intersymbol interference through the use of the active phase conjugation (APC) technique. The temporal compression characteristics of an APC system are analyzed using a simple discrete channel model, and numerical results are provided to justify the theoretical findings.
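A small numerical sketch (our simplification, not the paper's channel model) shows why APC compresses the channel in time: if the transponder retransmits the time-reversed complex conjugate of the received probe, the end-to-end response becomes the channel autocorrelation, which concentrates energy into one dominant tap.

```python
# Time-reversal / phase-conjugation temporal compression on a toy channel.
import numpy as np

rng = np.random.default_rng(2)

# Discrete multipath channel: four complex taps spread over 20 symbols
h = np.zeros(20, dtype=complex)
h[[0, 6, 13, 19]] = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# APC two-way response: h convolved with its time-reversed conjugate,
# i.e. the autocorrelation of h, which peaks sharply at the centre tap
g = np.convolve(h, np.conj(h[::-1]))

raw_peak = np.max(np.abs(h)); raw_isi = np.sum(np.abs(h)) - raw_peak
apc_peak = np.max(np.abs(g)); apc_isi = np.sum(np.abs(g)) - apc_peak
print(f"peak-to-ISI ratio: raw {raw_peak / raw_isi:.2f}, "
      f"APC {apc_peak / apc_isi:.2f}")
```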
Abstract:
Anti-islanding protection is becoming increasingly important due to the rapid installation of distributed generation from renewable resources such as wind, tidal and wave, solar PV and bio-fuels, as well as from other resources such as diesel. Unintentional islanding presents a potential risk of damage to utility plant and to equipment connected on the demand side, as well as a risk to the public and to personnel in utility plants. This paper investigates automatic islanding detection. This is achieved by deploying a statistical process control approach for fault detection on real-time data acquired through a wide area measurement system based on Phasor Measurement Unit (PMU) technology. In particular, principal component analysis (PCA) is used to project the data into a principal component subspace and a residual space, and two statistics are used to detect the occurrence of a fault. A fault reconstruction method is then used to identify the fault and its development over time. The proposed scheme has been applied to a real system, and the results confirm that the proposed method can correctly identify the fault and the islanding site.
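A compact sketch of the two monitoring statistics the abstract refers to, T² in the principal component subspace and the squared prediction error (SPE, also called Q) in the residual space, is shown below. The synthetic data, the number of retained components and the absence of proper control limits are our simplifications, not details from the paper.

```python
# PCA-based fault detection sketch: T^2 and SPE statistics.
import numpy as np

rng = np.random.default_rng(3)

# Normal operating data: 500 samples of 8 correlated "PMU" channels (synthetic)
A = rng.standard_normal((8, 8))
X = rng.standard_normal((500, 8)) @ A
mu, sd = X.mean(0), X.std(0)

U, S, Vt = np.linalg.svd((X - mu) / sd, full_matrices=False)
k = 3
P = Vt[:k].T                      # principal loadings (8 x k)
lam = S[:k] ** 2 / (len(X) - 1)   # variances of the retained components

def monitor(sample):
    """T^2 (principal subspace) and SPE (residual space) for one sample."""
    x = (sample - mu) / sd
    t = P.T @ x                   # scores
    T2 = float(np.sum(t ** 2 / lam))
    r = x - P @ t                 # residual-space projection
    return T2, float(r @ r)

ok = rng.standard_normal(8) @ A
fault = ok + 6.0 * np.eye(8)[2]   # step fault on channel 2
print("normal:", monitor(ok), "fault:", monitor(fault))
```

In practice the two statistics are compared against control limits estimated from the training data, and a sample exceeding either limit triggers the fault reconstruction step described above.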
Abstract:
Pressure myography studies have played a crucial role in our understanding of vascular physiology and pathophysiology. Such studies depend upon the reliable measurement of changes in the diameter of isolated vessel segments over time. Although several software packages are available to carry out such measurements on small arteries and veins, no such software exists to study smaller vessels (<50 µm in diameter). We provide here a new, freely available open-source tool, MyoTracker, to measure and track changes in the diameter of small isolated retinal arterioles. The program has been developed as an ImageJ plug-in and uses a combination of cost analysis and edge enhancement to detect the vessel walls. In tests performed on a dataset of 102 images, automatic measurements were found to be comparable to manual ones. The program was also able to track both fast and slow constrictions and dilations during intraluminal pressure changes and following the application of several drugs. Variability in automated measurements during the analysis of videos, as well as processing times, were also investigated and are reported. MyoTracker is new software for assisting with pressure myography experiments on small isolated retinal arterioles. It provides fast and accurate measurements with low levels of noise and works with both individual images and videos. Although the program was developed to work with small arterioles, it is also capable of tracking the walls of other types of microvessels, including venules and capillaries. It also works well with larger arteries, and may therefore provide an alternative to packages developed for larger vessels when its features are considered advantageous.
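A much-simplified sketch of the core measurement (not MyoTracker's cost-analysis and edge-enhancement pipeline): take a one-dimensional intensity profile across the vessel and place the walls at the strongest opposite-signed gradients, the distance between them giving the diameter. The synthetic profile and pixel size are assumptions.

```python
# Diameter from a cross-sectional intensity profile (illustrative only).
import numpy as np

def diameter_from_profile(profile, pixel_size_um=1.0):
    """Estimate vessel diameter from a 1-D intensity profile by locating the
    steepest falling and rising edges (the two walls)."""
    g = np.gradient(np.asarray(profile, dtype=float))
    left = int(np.argmin(g))      # steepest drop: first wall
    right = int(np.argmax(g))     # steepest rise: second wall
    return abs(right - left) * pixel_size_um

# Synthetic profile: dark vessel lumen on a bright background
x = np.arange(100)
profile = 200 - 120 * ((x > 35) & (x < 65))
print(diameter_from_profile(profile))  # ~30 pixels wide
```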
Abstract:
Background: Failure to recruit sufficient numbers of participants to randomized controlled trials is a common and serious problem. This problem may be especially acute in music therapy research.
Objective: To use the experience of conducting a large randomized controlled trial of music therapy for young people with emotional and behavioral difficulties to illustrate the strategies that can be used to optimize recruitment; to report on the success or otherwise of those strategies; and to draw general conclusions about the most effective approaches.
Methods: Review of the methodological literature, and a narrative account and realist analysis of the recruitment process.
Results: The strategies adopted led to the achievement of the recruitment target of 250 subjects, but only with an extension to the recruitment period. In the pre-protocol stage of the research, these strategies included the engagement of non-music therapy clinical investigators, and extensive consultation with clinical stakeholders. In the protocol development and initial recruitment stages, they involved a search of systematic reviews of factors leading to under-recruitment and of interventions to promote recruitment, and the incorporation of their insights into the research protocol and practices. In the latter stages of recruitment, various stakeholders including clinicians, senior managers and participant representatives were consulted in an attempt to uncover the reasons for the low recruitment levels that the research was experiencing.
Conclusions: The primary mechanisms to promote recruitment are education, facilitation, audit and feedback, and time allowed. The primary contextual factors affecting the effectiveness of these mechanisms are professional culture and organizational support.
Abstract:
Classification methods with embedded feature selection capability are very appealing for the analysis of complex processes, since they allow root causes to be investigated even when the number of input variables is high. In this work, we investigate the performance of three classification techniques within a Monte Carlo strategy, with the aim of root cause analysis. We consider the naive Bayes classifier and the logistic regression model with two different implementations for controlling model complexity: a LASSO-like implementation with an L1-norm regularization, and a fully Bayesian implementation of the logistic model, the so-called relevance vector machine. Several challenges can arise when estimating such models, mainly linked to the characteristics of the data: a large number of input variables, high correlation among subsets of variables, more variables than available data points, and unbalanced datasets. Using an ecological and a semiconductor manufacturing dataset, we show the advantages and drawbacks of each method, highlighting the superior classification accuracy of the relevance vector machine with respect to the other classifiers. Moreover, we show how the combination of the proposed techniques and the Monte Carlo approach can be used to obtain more robust insights into the problem under analysis when faced with challenging modelling conditions.
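One Monte Carlo round of the L1-regularized variant might look like the following sketch (our illustration with scikit-learn and synthetic data, not the study's implementation): refit on random subsamples and count how often each variable receives a nonzero coefficient, as a proxy for its relevance to the root cause.

```python
# Monte Carlo selection frequency with L1-regularized logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)

# 60 samples, 100 variables (p > n); only the first three are informative
X = rng.standard_normal((60, 100))
y = (X[:, 0] + X[:, 1] - X[:, 2] + 0.5 * rng.standard_normal(60) > 0).astype(int)

n_rounds, selected = 200, np.zeros(100)
for _ in range(n_rounds):
    idx = rng.choice(60, size=45, replace=False)  # random subsample
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    clf.fit(X[idx], y[idx])
    selected += np.abs(clf.coef_[0]) > 1e-8       # which variables survived

top = np.argsort(selected)[::-1][:5]
print("most frequently selected variables:", top, selected[top] / n_rounds)
```

Variables that survive the L1 penalty across many random subsamples are robust candidates for the root cause, which is the kind of insight the Monte Carlo wrapper is meant to provide.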