31 results for beat gesture
Abstract:
Background
When we move along in time with a piece of music, we synchronise the downward phase of our gesture with the beat. While it is easy to demonstrate this tendency, there is considerable debate as to its neural origins. It may have a structural basis, whereby the gravitational field acts as an orientation reference that biases the formulation of motor commands. Alternatively, it may be functional, and related to the economy with which motion assisted by gravity can be generated by the motor system.
Methodology/Principal Findings
We used a robotic system to generate a mathematical model of the gravitational forces acting upon the hand, and then to reverse the effect of gravity, and invert the weight of the limb. In these circumstances, patterns of coordination in which the upward phase of rhythmic hand movements coincided with the beat of a metronome were more stable than those in which downward movements were made on the beat. When a normal gravitational force was present, movements made down-on-the-beat were more stable than those made up-on-the-beat.
Conclusions/Significance
The ubiquitous tendency to make a downward movement on a musical beat arises not from the perception of gravity, but as a result of the economy of action that derives from its exploitation.
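The abstract above does not give the robot's control law, but the arithmetic behind "inverting the weight of the limb" is straightforward: to make the effective gravitational load point upward with the same magnitude, the device must push up with twice the limb's weight. The short Python sketch below illustrates this; the limb mass is an assumed value, not a figure from the study.

```python
# Illustrative only: the study's actual gravity-compensation model is not
# specified in the abstract. The limb mass below is an assumption.

g = 9.81          # gravitational acceleration (m/s^2)
m_limb = 2.0      # assumed effective mass of the hand + forearm segment (kg)

weight = m_limb * g            # downward gravitational force on the limb (N)

# Cancelling gravity requires an upward force equal to the weight (net load 0).
# Inverting the weight requires twice that, so the net load equals the
# original weight but points upward.
f_cancel = weight
f_invert = 2.0 * weight

print(f"limb weight:             {weight:.1f} N downward")
print(f"force to cancel gravity: {f_cancel:.1f} N upward (net 0 N)")
print(f"force to invert weight:  {f_invert:.1f} N upward (net {weight:.1f} N upward)")
```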
Abstract:
In a recent study, we reported that the accurate perception of beat structure in music ('perception of musical meter') accounted for over 40% of the variance in single word reading in children with and without dyslexia (Huss et al., 2011). Performance in the musical task was most strongly associated with the auditory processing of rise time, even though beat structure was varied by manipulating the duration of the musical notes.
Abstract:
The prevailing paradigm for researching sensorimotor synchronisation in humans involves finger tapping and temporal accuracy measures. However, many successful sensorimotor synchronisation actions require not only being 'in time' but also arriving at a predefined spatial position. In many everyday actions, reaching this position requires movements whose amplitude far exceeds that of a finger movement. The aim of this study is to address how people coordinate their movement to be in the right place at the right time when the scale of the movement varies. Do the scale of the movement and its accuracy demands change the ability to synchronise accurately? To address these questions, a sensorimotor synchronisation task with three different inter-beat intervals, two different movement amplitudes and two different target widths was used. Our experiment demonstrated that, for large-scale movements, people use different timing strategies: either a movement strategy (varying movement time) or a waiting strategy (keeping movement time constant). The two strategies were equally successful in terms of temporal accuracy and variability (spread of errors). With longer interval durations (2.5 and 3.5 s), the variability of sensorimotor synchronisation performance increased. Analysing the data using the Vorberg and Wing (Handbook of perception and action. Academic Press, New York, pp 181-262, 1996) model shows a need to extend existing timing models of sensorimotor synchronisation so that they apply to large-scale movements, where different movement strategies naturally emerge.
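The Vorberg and Wing (1996) framework applied above builds on the Wing-Kristofferson two-level timing model, in which each produced interval is the sum of a central timer interval and the difference of two motor delays, which predicts a negative lag-1 autocovariance of the inter-response intervals. The Python sketch below, with assumed variance values, simulates that decomposition and checks the two standard predictions; it is a generic illustration of the model class, not a reproduction of the authors' analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Wing-Kristofferson two-level timing model:
#   inter-response interval I_j = C_{j+1} + M_{j+1} - M_j
# where C are central timer intervals and M are peripheral motor delays.
# Parameter values are assumptions chosen for illustration.
n      = 20000       # number of intervals
mean_c = 0.500       # mean central timer interval (s), e.g. a 500 ms beat
sd_c   = 0.020       # central (clock) variability (s)
sd_m   = 0.010       # motor delay variability (s)

C = rng.normal(mean_c, sd_c, n + 1)     # timer intervals
M = rng.normal(0.0,    sd_m, n + 1)     # motor delays
I = C[1:] + M[1:] - M[:-1]              # produced inter-response intervals

var_I = I.var()
cov1  = np.cov(I[:-1], I[1:])[0, 1]     # lag-1 autocovariance

# Model predictions: Var(I) = sd_c^2 + 2*sd_m^2,  Cov(I_j, I_{j+1}) = -sd_m^2
print(f"Var(I):   simulated {var_I:.6f}  predicted {sd_c**2 + 2*sd_m**2:.6f}")
print(f"Cov lag1: simulated {cov1:.6f}  predicted {-sd_m**2:.6f}")
```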
Abstract:
The significance of the “physicality” involved in learning to play a musical instrument and the essential role of teachers are areas in need of research. This article explores the role of gesture within teacher–student communicative interaction in one-to-one piano lessons. Three teachers were required to teach a pre-selected repertoire of two contrasting pieces to three students studying piano grade 1. Data were collected through video recordings of the piano lessons and analysed according to the type and frequency of gestures employed by teachers in association with teaching behaviours, specifying where gestures fit within (or evade) predefined classifications. Spontaneous co-musical gestures were observed during piano tuition, emerging with general communicative purposes similar to those of spontaneous co-verbal gestures, and were essential to the process of musical communication between teachers and students. Observed frequencies of categorized gestures varied significantly between different teaching behaviours and between the three teachers. The parallels established between co-verbal and co-musical spontaneous gestures lead to an argument for extending McNeill’s (2005) imagery–language dialectic to an imagery–music dialectic, with relevant implications for piano pedagogy and for fields of study invested in musical communication.
Abstract:
Teachers’ communication of musical knowledge through physical gesture represents a valuable pedagogical field in need of investigation. This exploratory case study compares the gestural behaviour of three piano teachers while giving individual lessons to students who differed in piano proficiency level. Data were collected through video recordings of one-to-one piano lessons, and gestures were categorized using two gesture classifications: the spontaneous co-verbal gesture classification (McNeill, 1992; 2005) and the spontaneous co-musical gesture classification (Simones, Schroeder & Rodger, 2013). Poisson regression analysis and qualitative observation suggest a relationship between teachers’ didactic intentions and the types of gesture they produced while teaching, as shown by differences in gestural category frequency between teaching students of higher and lower levels of proficiency. This agreement between teachers’ gestural approach and student proficiency level points to a gestural scaffolding approach, whereby teachers adapted their gestural communicative channels to suit students’ specific conceptual skill levels.
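Poisson regression is the standard choice for modelling gesture counts such as those described above. As a rough illustration only (the data, predictors and model specification below are invented, not the authors'), the Python sketch fits a Poisson GLM to made-up gesture counts with teacher identity and student proficiency level as predictors, using statsmodels.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Made-up data: gesture counts per lesson segment for 3 teachers teaching
# students of two proficiency levels. The rates are invented for illustration.
teachers    = np.repeat([0, 1, 2], 40)
proficiency = np.tile(np.repeat([0, 1], 20), 3)      # 0 = lower, 1 = higher level
base_rate   = np.array([8.0, 12.0, 10.0])[teachers]
rate        = base_rate * np.where(proficiency == 1, 0.7, 1.0)
counts      = rng.poisson(rate)

df = pd.DataFrame({"teacher": teachers, "proficiency": proficiency, "count": counts})

# Poisson GLM: log(expected count) = b0 + b1*proficiency + teacher dummies
X = pd.get_dummies(df[["proficiency", "teacher"]], columns=["teacher"],
                   drop_first=True, dtype=float)
X = sm.add_constant(X)
model = sm.GLM(df["count"], X, family=sm.families.Poisson()).fit()
print(model.summary())
```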
Abstract:
Moving to a rhythm necessitates precise timing between the movement of the chosen limb and the timing imposed by the beats. However, the temporal information specifying the moment when a beat will sound (the moment onto which one must synchronise one's movement) is not continuously provided by the acoustic array. Because of this informational void, actors need some form of prospective information that allows them to act sufficiently ahead of time to get their hand in the right place at the right time. In this acoustic interception study, in which participants were asked to move between two targets such that they arrived and stopped in the target zone at the same time as a beat sounded, we tested a model derived from tau-coupling theory (Lee DN (1998) Ecol Psychol 10:221-250). This model proposes a form for a potential timing guide that specifies the duration of the inter-beat intervals and describes how this informational guide can be used in the timing and guidance of movements. The results of our first experiment show that, for inter-beat intervals of less than 3 s, a large proportion of the movement (over 70%) can be explained by the proposed model. However, a second experiment, in which the time between beats was increased beyond 3 s, showed a marked decline in the percentage of information/movement coupling. A close analysis of the movement kinematics indicates a lack of control and anticipation in the participants' movements. The implications of these findings are discussed in light of other research.
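In tau theory, the tau of a gap is the gap divided by its rate of closure, and a movement is tau-coupled to a guide when the two taus remain proportional. The Python sketch below illustrates that idea for an assumed minimum-jerk style approach to the target and an intrinsic tau-guide of the form tau_g(t) = 0.5*(t - T^2/t), one common statement of Lee's (1998) guide; the trajectory, parameter values and regression-based coupling check are assumptions for illustration, not the authors' analysis.

```python
import numpy as np

# Tau-coupling illustration: tau of a gap = gap / rate of gap closure.
# A movement is tau-coupled to a guide when tau_movement ~= k * tau_guide.
# The movement profile, guide form and parameters below are assumptions.

T = 1.0                                   # assumed inter-beat interval (s)
t = np.linspace(0.01, T, 500)             # avoid t = 0, where tau is undefined

# Assumed hand trajectory: minimum-jerk style approach from gap D down to 0.
D = 0.30                                  # movement amplitude (m)
s = t / T
x = D * (1 - (10 * s**3 - 15 * s**4 + 6 * s**5))   # remaining gap to target
xdot = np.gradient(x, t)                  # rate of gap closure (negative)

tau_move  = x / xdot                      # tau of the movement gap
tau_guide = 0.5 * (t - T**2 / t)          # intrinsic tau-guide (assumed form)

# If tau-coupled, tau_move = k * tau_guide: estimate k and how well a single
# constant accounts for the movement. Exclude the very start and end, where
# the taus diverge or vanish.
mask = (t > 0.05 * T) & (t < 0.95 * T)
k, _ = np.polyfit(tau_guide[mask], tau_move[mask], 1)
r2 = np.corrcoef(tau_guide[mask], tau_move[mask])[0, 1] ** 2
print(f"estimated coupling constant k = {k:.2f}, R^2 = {r2:.3f}")
```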
Abstract:
Augmented visual feedback can have a profound bearing on the stability of bimanual coordination. Indeed, it has been used to render tractable the study of patterns of coordination that cannot otherwise be produced in a stable fashion. In previous investigations (Carson et al. 1999), we showed that rhythmic movements brought about by the contraction of muscles on one side of the body lead to phase-locked changes in the excitability of homologous motor pathways of the opposite limb. The present study was conducted to assess whether these changes are influenced by the presence of visual feedback of the moving limb. Eight participants performed rhythmic flexion-extension movements of the left wrist to the beat of a metronome (1.5 Hz). In 50% of trials, visual feedback of wrist displacement was provided in relation to a target amplitude, defined by the mean movement amplitude generated during the immediately preceding no-feedback trial. Motor evoked potentials (MEPs) were elicited in the quiescent muscles of the right limb by magnetic stimulation of the left motor cortex. Consistent with our previous observations, MEP amplitudes were modulated during the movement cycle of the opposite limb. The extent of this modulation was, however, smaller in the presence of visual feedback of the moving limb (FCR ω² = 0.41; ECR ω² = 0.29) than in trials in which there was no visual feedback (FCR ω² = 0.51; ECR ω² = 0.48). In addition, the relationship between the level of FCR activation and the excitability of the homologous corticospinal pathway of the opposite limb was sensitive to the vision condition; the correlation between the two variables was larger when there was no visual feedback of the moving limb. These results support the view that increases in the stability of bimanual coordination brought about by augmented feedback may be mediated by changes in the crossed modulation of excitability in homologous motor pathways.
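The ω² values reported above are effect-size estimates for the modulation of MEP amplitude across the movement cycle. As a reminder of how such an estimate is commonly computed (the abstract does not specify the exact ANOVA design, so the one-way formula and the data below are assumptions), here is a short Python sketch.

```python
import numpy as np

def omega_squared(groups):
    """One-way ANOVA omega-squared effect size.

    omega^2 = (SS_between - df_between * MS_within) / (SS_total + MS_within)
    This is the textbook one-way formula; the study's actual design may differ.
    """
    all_vals = np.concatenate(groups)
    grand_mean = all_vals.mean()
    ss_total = ((all_vals - grand_mean) ** 2).sum()
    ss_between = sum(len(g) * (np.mean(g) - grand_mean) ** 2 for g in groups)
    ss_within = ss_total - ss_between
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    ms_within = ss_within / df_within
    return (ss_between - df_between * ms_within) / (ss_total + ms_within)

# Made-up MEP amplitudes (arbitrary units) at four phases of the movement
# cycle, only to show the computation; these are not the study's data.
rng = np.random.default_rng(2)
phases = [rng.normal(mu, 0.15, 30) for mu in (1.0, 1.2, 1.5, 1.1)]
print(f"omega^2 = {omega_squared(phases):.2f}")
```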
Abstract:
The authors investigated how the intention to passively perform a behavior and the intention to persist with a behavior impact upon the spatial and temporal properties of bimanual coordination. Participants (N = 30) were asked to perform a bimanual coordination task that demanded the continuous rhythmic extension-flexion of the wrists. The frequency of movement was scaled by an auditory metronome beat from 1.5 Hz, increasing to 3.25 Hz in .25-Hz increments. The task was further defined by the requirement that the movements be performed initially in a prescribed pattern of coordination (in-phase or antiphase) while the participants assumed one of two different intentional states: stay with the prescribed pattern should it become unstable or do not intervene should the pattern begin to change. Transitions away from the initially prescribed pattern were observed only in trials conducted in the antiphase mode of coordination. The time at which the antiphase pattern of coordination became unstable was not found to be influenced by the intentional state. In addition, the do-not-intervene set led to a switch to an in-phase pattern of coordination whereas the stay set led to phase wandering. Those findings are discussed within the framework of a dynamic account of bimanual coordination.
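The "dynamic account of bimanual coordination" referred to above is commonly formalised with the Haken-Kelso-Bunz (HKB) relative-phase equation, dphi/dt = -a*sin(phi) - 2b*sin(2*phi), in which in-phase (phi = 0) remains stable while antiphase (phi = pi) loses stability as movement frequency rises and the ratio b/a shrinks. The Python sketch below, with assumed parameter and noise values, integrates that equation while b is scaled down, reproducing the antiphase-to-in-phase transition described above; it is a generic illustration of this model class, not the authors' model.

```python
import numpy as np

# Haken-Kelso-Bunz (HKB) relative-phase dynamics:
#   dphi/dt = -a*sin(phi) - 2*b*sin(2*phi)
# phi = 0 is in-phase, phi = pi is antiphase. As movement frequency increases,
# b/a decreases and antiphase loses stability (deterministically at b/a = 0.25).
# All parameter values and the noise level below are assumptions.

rng = np.random.default_rng(3)
dt = 0.005
a = 1.0
phi = np.pi                      # start in antiphase
noise_sd = 0.08

# Scale b down over the trial, mimicking the metronome rate increasing
# from 1.5 Hz to 3.25 Hz.
b_values = np.linspace(1.0, 0.0, 40000)

trajectory = np.empty_like(b_values)
for i, b in enumerate(b_values):
    dphi = -a * np.sin(phi) - 2 * b * np.sin(2 * phi)
    phi += dphi * dt + noise_sd * np.sqrt(dt) * rng.normal()
    trajectory[i] = phi

# Report where relative phase leaves the antiphase regime (|phi - pi| > pi/2).
switched = np.abs(trajectory - np.pi) > np.pi / 2
if switched.any():
    i_switch = switched.argmax()
    print(f"antiphase lost at b/a = {b_values[i_switch] / a:.2f}")
else:
    print("no transition observed in this run")
```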
Abstract:
Goal-directed, coordinated movements in humans emerge from a variety of constraints that range from 'high-level' cognitive strategies based on perception of the task to 'low-level' neuromuscular-skeletal factors such as differential contributions to coordination from flexor and extensor muscles. There has been a tendency in the literature to dichotomize these sources of constraint, favouring one or the other rather than recognizing and understanding their mutual interplay. In this experiment, subjects were required to coordinate rhythmic flexion and extension movements with an auditory metronome, the rate of which was systematically increased. When subjects started in extension on the beat of the metronome, there was a small tendency to switch to flexion at higher rates, but not vice versa. When subjects were asked to contact a physical stop, the location of which was either coincident with or counterphase to the auditory stimulus, two effects occurred. When haptic contact was coincident with the sound, coordination was stabilized for both flexion and extension. When haptic contact was counterphase to the metronome, coordination was actually destabilized, with transitions occurring both from extension on the beat to flexion on the beat and vice versa. These results reveal the complementary nature of strategic and neuromuscular factors in sensorimotor coordination. They also suggest the presence of a multimodal neural integration process, parametrizable by rate and context, in which intentional movement, touch and sound are bound into a single, coherent unit.