Abstract:
Television’s long-form storytelling has the potential to allow the rippling of music across episodes and seasons in interesting ways. In the integration of narrative, music and meaning found in The O.C. (Fox, 2003–7), popular song’s allusive and referential qualities are drawn upon to particularly televisual ends: at times the series embraces the song’s ‘disruptive’ presence, at others it sutures popular music into narrative, and at times it does both at once. With television studies largely lacking theories of music, this chapter draws on film music theory and close textual analysis to examine some of the programme’s music moments in detail. In particular, it considers the series-spanning use of Jeff Buckley’s cover of ‘Hallelujah’ (and its subsequent oppressive presence across multiple televisual texts), the end-of-episode musical montage, and the use of recurring song fragments as themes within single episodes. In doing so it highlights music’s role in the fragmentation and flow of the television aesthetic and popular song’s structural presence in television narrative. Illustrating the multiplicity of popular song’s uses in television, these moments demonstrate song’s ability to provide narrative commentary, yet also make particular use of what Ian Garwood describes as the ability of ‘a non-diegetic song to exceed the emotional range displayed by diegetic characters’ (2003: 115) to ‘speak’ for characters or to their feelings, contributing to both teen TV’s melodramatic affect and its narrative expression.
Abstract:
The feedback mechanism used in a brain-computer interface (BCI) forms an integral part of the closed-loop learning process required for successful operation of a BCI. However, the ultimate success of the BCI may depend on the modality of the feedback used. This study explores the use of music tempo as a feedback mechanism in BCI and compares it to the more commonly used visual feedback mechanism. Three different feedback modalities are compared for a kinaesthetic motor imagery BCI: visual, auditory via music tempo, and a combined visual and auditory feedback modality. Visual feedback is provided via the vertical (y-axis) position of a moving ball. In the music feedback condition, the tempo of a piece of continuously generated music is dynamically adjusted via a novel music-generation method. All the feedback mechanisms allowed users to learn to control the BCI. However, users were not able to maintain control as stably in the music tempo feedback condition as they could in the visual and combined conditions. Additionally, the combined condition exhibited significantly less inter-user variability, suggesting that multi-modal feedback may lead to more robust results. Finally, common spatial patterns are used to identify participant-specific spatial filters for each of the feedback modalities. The mean optimal spatial filter obtained for the music feedback condition is observed to be more diffuse and weaker than the mean spatial filters obtained for the visual and combined feedback conditions.
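As an illustration of the spatial-filtering step this abstract mentions, below is a minimal sketch of common spatial patterns (CSP) computed via a generalized eigendecomposition. The synthetic trials, channel count, and shapes are assumptions for demonstration only, not the study's actual pipeline.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b):
    """Compute CSP spatial filters for two classes of EEG trials.

    trials_*: arrays of shape (n_trials, n_channels, n_samples).
    Returns one spatial filter per row, sorted so the first rows
    maximise variance for class a relative to class b.
    """
    def mean_cov(trials):
        # Average per-trial channel-covariance matrices.
        return np.mean([np.cov(t) for t in trials], axis=0)

    ca, cb = mean_cov(trials_a), mean_cov(trials_b)
    # Generalized eigendecomposition: ca v = w (ca + cb) v.
    vals, vecs = eigh(ca, ca + cb)
    order = np.argsort(vals)[::-1]  # descending eigenvalues
    return vecs[:, order].T

# Random data standing in for motor-imagery EEG (20 trials, 8 channels).
rng = np.random.default_rng(0)
a = rng.standard_normal((20, 8, 256))
b = rng.standard_normal((20, 8, 256))
w = csp_filters(a, b)
print(w.shape)  # (8, 8): one spatial filter per row
```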
Abstract:
This paper presents an EEG study into the neural correlates of music-induced emotions. We presented participants with a large dataset of musical pieces in different styles and asked them to report their induced emotional responses. We found neural correlates of music-induced emotion in a number of frequency bands over the prefrontal cortex. Additionally, we found a set of patterns of functional connectivity, defined by inter-channel coherence measures, to be significantly different between groups of music-induced emotional responses.
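A minimal sketch of the kind of inter-channel coherence measure referred to above, using scipy.signal.coherence on two synthetic channels; the sampling rate, band limits, and signals are illustrative assumptions rather than the study's data.

```python
import numpy as np
from scipy.signal import coherence

fs = 256  # assumed sampling rate in Hz
rng = np.random.default_rng(1)

# Two synthetic "channels" sharing a common 10 Hz component plus noise.
t = np.arange(30 * fs) / fs
common = np.sin(2 * np.pi * 10 * t)
ch1 = common + 0.5 * rng.standard_normal(t.size)
ch2 = common + 0.5 * rng.standard_normal(t.size)

# Magnitude-squared coherence as a function of frequency.
f, cxy = coherence(ch1, ch2, fs=fs, nperseg=2 * fs)
alpha = (f >= 8) & (f <= 12)
print(f"mean alpha-band coherence: {cxy[alpha].mean():.2f}")
```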
Abstract:
The neural mechanisms of music listening and appreciation are not yet completely understood. Based on the apparent relationship between the tempo (beats per minute) of music and the desire to move (for example, foot tapping) induced while listening to it, it is hypothesised that musical tempo may evoke movement-related activity in the brain. Participants are instructed to listen, without moving, to a large set of musical pieces spanning a range of styles and tempos during an electroencephalogram (EEG) experiment. Event-related desynchronisation (ERD) in the EEG is observed to correlate significantly with the variance of the tempo of the musical stimuli. This suggests that the dynamics of the beat of the music may induce movement-related brain activity in the motor cortex. Furthermore, significant correlations are observed between EEG activity in the alpha band over the motor cortex and the bandpower of the music in the same frequency band over time. This relationship correlates with the strength of the ERD, suggesting that entrainment of motor cortical activity relates to increased ERD strength.
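To make the ERD measure concrete, here is a hedged sketch that computes alpha-band event-related desynchronisation as a percentage power drop relative to a baseline, then correlates it with per-piece tempo variance. The data, sampling rate, and band limits are synthetic assumptions, not the study's recordings.

```python
import numpy as np
from scipy.signal import welch
from scipy.stats import pearsonr

fs = 256  # assumed sampling rate in Hz

def alpha_bandpower(x):
    """Mean 8-12 Hz power via Welch's method."""
    f, pxx = welch(x, fs=fs, nperseg=fs)
    return pxx[(f >= 8) & (f <= 12)].mean()

def erd_percent(baseline, listening):
    """ERD as the percentage power decrease relative to baseline."""
    pb, pl = alpha_bandpower(baseline), alpha_bandpower(listening)
    return 100.0 * (pb - pl) / pb

rng = np.random.default_rng(2)
n_pieces = 30
tempo_variance = rng.uniform(0, 25, n_pieces)  # hypothetical per-piece values
# Random signals stand in for baseline and listening EEG segments,
# so the resulting correlation here is near zero by construction.
erd = np.array([
    erd_percent(rng.standard_normal(10 * fs), rng.standard_normal(10 * fs))
    for _ in range(n_pieces)
])
r, p = pearsonr(tempo_variance, erd)
print(f"r={r:.2f}, p={p:.3f}")
```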
Abstract:
In Indian classical music, ragas constitute specific combinations of tonic intervals potentially capable of evoking distinct emotions. A raga composition is typically presented in two modes, namely alaap and gat. Alaap is the note-by-note delineation of a raga, bound by a slow tempo but not by a rhythmic cycle. Gat, on the other hand, is rendered at a faster tempo and follows a rhythmic cycle. Our primary objectives were to (1) discriminate the emotions experienced across the alaap and gat of ragas, and (2) investigate the association of tonic intervals, tempo and rhythmic regularity with emotional response. A total of 122 participants rated their experienced emotion across the alaap and gat of 12 ragas. Analysis of the emotional responses revealed that (1) ragas elicit distinct emotions across the two presentation modes, (2) specific tonic intervals are robust predictors of emotional response; in particular, the ‘minor second’ is a direct predictor of negative valence, and (3) tonality determines the emotion experienced for a raga, whereas rhythmic regularity and tempo modulate levels of arousal. Our findings provide new insights into the emotional response to Indian ragas and the impact of tempo, rhythmic regularity and tonality on it.
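As a rough illustration of treating tonic intervals and tempo as predictors of rated emotion, the following least-squares sketch regresses hypothetical valence ratings on a minor-second indicator and tempo. The features, coefficients, and ratings are invented for demonstration; only the participant count is borrowed from the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 122  # participant count from the abstract; everything else is synthetic

# Hypothetical design matrix: intercept, minor second present (0/1), tempo (BPM).
minor_second = rng.integers(0, 2, n)
tempo = rng.uniform(60, 180, n)
X = np.column_stack([np.ones(n), minor_second, tempo])

# Synthetic valence ratings with a negative weight on the minor second,
# mirroring the abstract's reported direction of effect.
valence = 0.5 - 1.2 * minor_second + 0.002 * tempo + rng.normal(0, 0.3, n)

beta, *_ = np.linalg.lstsq(X, valence, rcond=None)
print(dict(zip(["intercept", "minor_second", "tempo"], beta.round(3))))
```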
Abstract:
A brain-computer music interface (BCMI) is developed to allow continuous modification of the tempo of dynamically generated music. Six out of seven participants are able to control the BCMI at significant accuracies, and their performance is observed to increase over time.
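One plausible way to realise the continuous tempo modification described here is a clamped linear map from a decoded control value to a BPM range; the range and the [0, 1] control convention are assumptions, not the paper's design.

```python
def control_to_tempo(control: float, lo_bpm: float = 60.0, hi_bpm: float = 160.0) -> float:
    """Map a decoded BCI control value in [0, 1] to a tempo in BPM.

    Values outside [0, 1] are clamped so the music generator never
    receives an out-of-range tempo.
    """
    c = min(max(control, 0.0), 1.0)
    return lo_bpm + c * (hi_bpm - lo_bpm)

print(control_to_tempo(0.75))  # 135.0 under the assumed 60-160 BPM range
```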
Abstract:
Weather is frequently used in music to frame events and emotions, yet quantitative analyses are rare. From a collated base set of 759 weather-related songs, 419 were analysed based on listings from a karaoke database. This article analyses the 20 weather types described, their frequency of occurrence, genre, keys, mimicry, lyrics and songwriters. Vocals were the principal means of communicating weather: sunshine was the most common type, followed by rain, with weather depictions linked to the emotions of the song. Bob Dylan, John Lennon and Paul McCartney wrote the most weather-related songs, partly reflecting their experiences at the time of writing.
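A minimal sketch of the sort of frequency count such an analysis involves: tallying weather keywords across song titles. The titles and keyword list here are invented stand-ins for the karaoke-database listings.

```python
import re
from collections import Counter

# Hypothetical stand-ins for karaoke-database song titles.
titles = [
    "Here Comes the Sun", "Rainy Day Women", "Blowin' in the Wind",
    "Sunshine of Your Love", "Riders on the Storm",
]
weather_terms = ["sun", "sunshine", "rain", "rainy", "wind", "storm", "snow"]

counts = Counter()
for title in titles:
    for term in weather_terms:
        # Whole-word match so "sun" does not also count "sunshine".
        if re.search(rf"\b{term}\b", title, flags=re.IGNORECASE):
            counts[term] += 1

print(counts.most_common())
```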