980 results for Calloway, Chris


Relevance:

10.00%

Publisher:

Abstract:

The solo trombone recital was once a rare musical event, but in recent years professional and amateur trombonists have come to present solo performances frequently. The trombone has existed since the latter half of the 15th century, and there is a wealth of ensemble repertoire written for the instrument; however, there is no corresponding corpus of solo works. A small body of solo works does exist, from baroque sonatas and the alto trombone concertos of Leopold Mozart and Georg Wagenseil to the romantic works of Ferdinand David and Nicolai Rimsky-Korsakov. This repertoire is small in number, and a modern trombonist often has to resort to orchestral reductions and arrangements for performance in a solo recital setting. The trombone came into its own as a solo instrument in the 20th century, and it is in this era that the bulk of a modern trombonist's repertoire resides. While there is now no shortage of music to choose from, presenting a diverse yet musically cohesive recital remains a challenge, though meeting it can open up many interesting musical opportunities. While the piano is an extremely versatile instrument, pairing the trombone with percussion opens up possibilities that are absent from the more traditional piano pairing. Percussion instruments can offer an almost unlimited variation of timbre and dynamics to complement the trombone. The trombone's dynamic range must also be considered, as the instrument can play at the extremes of that range, and percussion instruments can match it at those extremes. When presenting a recital of 20th- and 21st-century music, using timbre and dynamic range as selection criteria when planning the program is an effective way to bring a unique and intense musical experience to the audience. In this paper, the two considerations of dynamics and timbre will be explored and the need for a dissertation recital project will be explained.

Relevance:

10.00%

Publisher:

Abstract:

The trumpet experienced important changes in its musical use during the middle and late Baroque period. Prior to the Baroque, and even into the first half of the 17th century, the trumpet had historically been used for rather "non-musical" purposes, sometimes as an instrument for battle or as a tool in the town square to announce the arrival of a dignitary. On the whole, the trumpet was most certainly not used as an instrument of melody; that role was typically reserved for violins, flutes, and oboes. However, in the late 1600s, composers such as Alessandro Stradella and Henry Purcell began to treat the trumpet differently. They saw its melodic potential and began to feature the trumpet as an instrument of melody, as opposed to relegating it to outlining triads and emphasizing harmony. Of course, keyboard, string, and woodwind instruments had long established a significant catalogue of works by the late 17th century. Additionally, even after the trumpet had been established as an instrument of melody, prominent composers of the time still wrote significantly more solo music for these other instrument families than for the trumpet. Consequently, the overall Baroque repertoire for solo trumpet pales in comparison to that of the other families of instruments. But much of the Baroque literature not originally written for trumpet can be presented effectively in the form of a transcription, thereby adding greatly to the Baroque solo trumpet repertoire. The goal of these three dissertation recitals is twofold: 1) to perform literature from a variety of countries of origin spanning the entire Baroque era, and 2) to feature music that has remained relatively unknown in the trumpet world yet is musically strong. I will also introduce viable "new" music to the trumpet repertoire through transcriptions of Baroque works originally written for other instruments or voice. The majority of these transcriptions originate from my own listening and study of Baroque music, and I have selected music that I felt would translate well to the trumpet.

Relevance:

10.00%

Publisher:

Abstract:

A strong case can be made that no other instrument has experienced as dramatic an increase in artistic solo repertoire over the past sixty years as the tuba. Prior to 1954, the mainstays of the tuba repertoire were trite caricature pieces such as Solo Pomposo, Rocked in the Cradle of the Deep, Beelzebub, and Bombastoso. A few tubists, seeing the tremendous repertoire great composers had written for their brass brethren, took it upon themselves to raise the standard of original compositions for tuba. These pioneers and champions of the tuba accomplished a great deal in the mid to late twentieth century. They structured a professional organization to solidify their ranks, planned and performed in the first tuba recitals at Carnegie Hall, organized the First International Tuba Symposium-Workshop, indirectly created more prestigious positions for tuba specialists at major universities, and improved the quantity and quality of the solo tuba repertoire. This dissertation focuses on the development of the solo tuba repertoire in the United States, driven by the tremendous efforts of William Bell, Harvey Phillips, Roger Bobo, and R. Winston Morris. Because of their tireless work, tubists today enjoy a multitude of great solo works, including traditional sonatas, concertos, and chamber music as well as cutting-edge repertoire written in many genres and for a variety of accompanying mediums. This dissertation traces the development of the repertoire by presenting works of American composers in varying genres and musical styles from 1962 to the present through three performed recitals.

Relevance:

10.00%

Publisher:

Abstract:

College students receive a wealth of information through electronic communications that they are unable to process efficiently. This information overload negatively impacts their affect, defined in psychology as the experience of feeling or emotion. To address this problem, we postulated that we could create an application that organizes and presents incoming content in a manner that optimizes users' ability to process information. First, we conducted surveys that quantitatively measured each participant's psychological affect while handling electronic communications, and we used the results to tailor the application's features to what users desire. After designing and implementing the application, we again measured users' affect while they used the product. Our goal was for the program to promote a positive change in affect. Our application, Brevitus, was able to match Gmail on affect reduction profiles while succeeding in implementing certain user interface specifications.

Relevance:

10.00%

Publisher:

Abstract:

Behavioral Parent Training (BPT) is a well-established therapy that reduces child externalizing behaviors and parent stress. Although BPT was originally developed for parents of children with defiant behaviors, the program's key concepts are relevant to parenting all children. Because parents might not fully utilize BPT due to cost and program location, we created an online game as a low-cost, easily accessible alternative or complement to BPT. We tested the game with nineteen undergraduate students at the University of Maryland. The experimental group completed a pretest survey on core BPT knowledge, played the game, and completed a BPT posttest, while the control group completed a pretest and posttest survey over a three-week period. Participants in the experimental group also completed a survey to indicate their satisfaction with the overall program. The experimental group demonstrated significantly higher levels of BPT knowledge than the control group, along with high levels of satisfaction. This suggests that an interactive, online BPT platform is an engaging and accessible way for parents to learn key concepts.

Relevance:

10.00%

Publisher:

Abstract:

Income inequality undermines societies: the more inequality, the more health problems and social tension, and the lower the social mobility, trust, and life expectancy. Given people's tendency to legitimate existing social arrangements, the stereotype content model (SCM) argues that ambivalence (perceiving many groups as either warm or competent, but not both) may help maintain socio-economic disparities. We examine the association between stereotype ambivalence and income inequality in 37 cross-national samples from Europe, the Americas, Oceania, Asia, and Africa, investigating how groups' overall warmth-competence, status-competence, and competition-warmth correlations vary across societies, and whether these variations are associated with income inequality (Gini index). More unequal societies report more ambivalent stereotypes, whereas more equal ones dislike competitive groups and do not necessarily respect them as competent. Unequal societies may need ambivalence for system stability: income inequality compensates groups with partially positive social images. © 2012 The British Psychological Society.
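
A toy Python sketch of the kind of two-level analysis described above: compute a warmth-competence correlation within each society as an ambivalence proxy, then relate it to the Gini index across societies. The data below are synthetic stand-ins, not the study's samples, and the snippet is only meant to make the analytic logic concrete.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: ratings of 20 social groups on warmth and
# competence in each of 37 societies, plus each society's Gini index.
n_societies, n_groups = 37, 20
gini = rng.uniform(25, 60, n_societies)

ambivalence = np.empty(n_societies)
for s in range(n_societies):
    warmth = rng.normal(3.0, 0.6, n_groups)
    # a weak inverse relation is built in so the toy example is not pure noise
    competence = 3.0 - 0.3 * (warmth - 3.0) + rng.normal(0, 0.6, n_groups)
    r = np.corrcoef(warmth, competence)[0, 1]
    ambivalence[s] = -r   # more negative warmth-competence r = more ambivalent

# Society-level association between stereotype ambivalence and inequality
print(np.corrcoef(ambivalence, gini)[0, 1])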

Relevance:

10.00%

Publisher:

Abstract:

This paper considers the problem of sequencing n jobs in a three-machine flow shop with the objective of minimizing the makespan, which is the completion time of the last job. An O(n log n) time heuristic that is based on Johnson's algorithm is presented. It is shown to generate a schedule with length at most 5/3 times that of an optimal schedule, thereby reducing the previous best available worst-case performance ratio of 2. An application to the general flow shop is also discussed.
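
Because the heuristic is built on Johnson's algorithm, a minimal Python sketch of that building block may be useful. The three-machine aggregation used in the example is a standard illustrative device and is not claimed to be the paper's 5/3-approximation construction.

def johnsons_rule(jobs):
    """Johnson's rule for the two-machine flow shop F2 || Cmax.

    jobs: list of (a, b) processing times on machines 1 and 2.
    Returns job indices in an optimal processing order.
    """
    first = sorted((i for i, (a, b) in enumerate(jobs) if a < b),
                   key=lambda i: jobs[i][0])
    last = sorted((i for i, (a, b) in enumerate(jobs) if a >= b),
                  key=lambda i: jobs[i][1], reverse=True)
    return first + last


def makespan_three_machines(order, jobs3):
    """Makespan of a permutation schedule in a three-machine flow shop."""
    c1 = c2 = c3 = 0
    for i in order:
        a, b, c = jobs3[i]
        c1 += a                 # completion on machine 1
        c2 = max(c2, c1) + b    # completion on machine 2
        c3 = max(c3, c2) + c    # completion on machine 3
    return c3


# Illustration: sequence the three-machine jobs by applying Johnson's rule
# to the aggregated two-machine instance (a + b, b + c).  This aggregation
# is a common device, not necessarily the paper's 5/3 heuristic.
jobs3 = [(3, 1, 4), (2, 5, 2), (4, 2, 3)]
order = johnsons_rule([(a + b, b + c) for a, b, c in jobs3])
print(order, makespan_three_machines(order, jobs3))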

Relevance:

10.00%

Publisher:

Abstract:

In many practical situations, batching of similar jobs to avoid setups is performed while constructing a schedule. This paper addresses the problem of non-preemptively scheduling independent jobs in a two-machine flow shop with the objective of minimizing the makespan. Jobs are grouped into batches. A sequence independent batch setup time on each machine is required before the first job is processed, and when a machine switches from processing a job in some batch to a job of another batch. Besides its practical interest, this problem is a direct generalization of the classical two-machine flow shop problem with no grouping of jobs, which can be solved optimally by Johnson's well-known algorithm. The problem under investigation is known to be NP-hard. We propose two O(n log n) time heuristic algorithms. The first heuristic, which creates a schedule with minimum total setup time by forcing all jobs in the same batch to be sequenced in adjacent positions, has a worst-case performance ratio of 3/2. By allowing each batch to be split into at most two sub-batches, a second heuristic is developed which has an improved worst-case performance ratio of 4/3. © 1998 The Mathematical Programming Society, Inc. Published by Elsevier Science B.V.
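
The following Python sketch illustrates the flavour of the first heuristic: all jobs of a batch are kept adjacent and the batches are ordered by Johnson's rule applied to aggregated batch data. The setup model noted in the comments is an assumption made for illustration; the paper's exact heuristics and their 3/2 and 4/3 worst-case analyses are not reproduced here.

def johnson_order(items):
    """Johnson's rule on (key, a, b) tuples; returns the keys in order."""
    first = sorted((x for x in items if x[1] < x[2]), key=lambda x: x[1])
    last = sorted((x for x in items if x[1] >= x[2]),
                  key=lambda x: x[2], reverse=True)
    return [x[0] for x in first + last]


def batched_makespan(batches, order, s1, s2):
    """Two-machine flow shop makespan when each batch's jobs stay adjacent.

    Setup model assumed here (one of several in the literature): the setup on
    a machine simply occupies that machine before the batch's first operation
    there and needs nothing from the other machine.
    """
    m1 = m2 = 0.0
    for b in order:
        m1 += s1                      # batch setup on machine 1
        m2 += s2                      # batch setup on machine 2
        for a, bb in batches[b]:
            m1 += a                   # job finishes on machine 1
            m2 = max(m2, m1) + bb     # then on machine 2
    return m2


# Batches of (machine-1, machine-2) job times, with setups s1 = s2 = 2.
batches = [[(3, 2), (1, 4)], [(2, 2)], [(5, 1), (2, 3)]]
s1 = s2 = 2
agg = [(k, s1 + sum(a for a, _ in jobs), s2 + sum(b for _, b in jobs))
       for k, jobs in enumerate(batches)]
order = johnson_order(agg)
print(order, batched_makespan(batches, order, s1, s2))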

Relevance:

10.00%

Publisher:

Abstract:

Solder materials are used to provide a connection between electronic components and printed circuit boards (PCBs) using either the reflow or wave soldering process. As a board assembly passes through a reflow furnace, the solder (initially in the form of solder paste) melts, reflows, then solidifies, and finally deforms between the chip and board. A number of defects may occur during this process, such as flux entrapment, void formation, and cracking of the joint, chip, or board. These defects are a serious concern to industry, especially with trends towards increasing component miniaturisation and smaller pitch sizes. This paper presents a modelling methodology for predicting solder joint shape, solidification, and deformation (stress) during the assembly process.

Relevance:

10.00%

Publisher:

Abstract:

Realizing scalable performance on high performance computing systems is not straightforward for single-phenomenon codes (such as computational fluid dynamics [CFD]). This task is magnified considerably when the target software involves the interaction of a range of phenomena that have distinct solution procedures involving different discretization methods. The key issues of retaining data integrity and preserving the ordering of the calculation procedures are significant challenges. A strategy for parallelizing this multiphysics family of codes is described for software exploiting finite-volume discretization methods on unstructured meshes using iterative solution procedures. A mesh-partitioning-based SPMD approach is used. However, since different variables use distinct discretization schemes, distinct partitions are required; techniques for addressing this issue are described using the mesh-partitioning tool JOSTLE. In this contribution, the strategy is tested on a variety of test cases under a wide range of conditions (e.g., problem size, number of processors, asynchronous/synchronous communications) using a variety of strategies for mapping the mesh partition onto the processor topology.
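
To make the mesh-partitioning idea concrete, the following Python sketch (illustrative only; it is neither JOSTLE nor the strategy's implementation) splits a mesh's cells into per-process blocks and collects each process's halo, i.e. the remote cells that its local stencils reference and that would have to be exchanged in an SPMD code.

def block_partition(n_cells, n_procs):
    """Naive contiguous block partition of cell indices (a real code would
    obtain the partition from a mesh-partitioning tool instead)."""
    size = (n_cells + n_procs - 1) // n_procs
    return [min(i // size, n_procs - 1) for i in range(n_cells)]


def halos(adjacency, part, n_procs):
    """For each process, the set of remote cells its local cells reference."""
    halo = [set() for _ in range(n_procs)]
    for c, nbrs in enumerate(adjacency):
        for nb in nbrs:
            if part[nb] != part[c]:
                halo[part[c]].add(nb)   # remote cell needed for c's stencil
    return halo


# 1-D chain of 10 cells: cell i is coupled to cells i-1 and i+1
adjacency = [[j for j in (i - 1, i + 1) if 0 <= j < 10] for i in range(10)]
part = block_partition(10, 3)
print(part)
print(halos(adjacency, part, 3))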

Relevance:

10.00%

Publisher:

Abstract:

The future of many companies will depend to a large extent on their ability to initiate techniques that bring schedules, performance, tests, support, production, life-cycle costs, reliability prediction, and quality control into the earliest stages of the product creation process. Important questions for an engineer responsible for the quality of electronic parts such as printed circuit boards (PCBs) during design, production, assembly, and after-sales support are: What is the impact of temperature? What is the impact of this temperature on the stress produced in the components? What is the electromagnetic compatibility (EMC) associated with such a design? At present, thermal, stress, and EMC calculations are undertaken using different software tools that each require model building and meshing. This leads to a large investment in time, and hence cost, to undertake each of these simulations. This paper discusses the progression towards a fully integrated software environment, based on a common data model and user interface, with the capability to predict temperature, stress, and EMC fields in a coupled manner. Such a modelling environment, used early in the design stage of an electronic product, will provide engineers with fast answers to questions regarding thermal, stress, and EMC issues. The paper concentrates on recent developments in creating such an integrated modelling environment, with preliminary results from the analyses conducted. Further research into the thermal- and stress-related aspects of the paper is being conducted under a nationally funded project, while their application in reliability prediction will be addressed in a new European project called PROFIT.

Relevance:

10.00%

Publisher:

Abstract:

We present a dynamic distributed load balancing algorithm for parallel, adaptive Finite Element simulations in which we use preconditioned Conjugate Gradient solvers based on domain decomposition. The load balancing is designed to maintain good partition aspect ratios, and we show that cut size is not always the appropriate measure in load balancing. Furthermore, we attempt to answer the question of why the aspect ratio of partitions plays an important role for certain solvers. We define and rate different kinds of aspect ratio and present a new center-based partitioning method for calculating the initial distribution that implicitly optimizes this measure. During the adaptive simulation, the load balancer calculates a balancing flow using different versions of the diffusion algorithm and a variant of breadth-first search. Elements to be migrated are chosen according to a cost function aimed at optimizing subdomain shapes. Experimental results for Bramble's preconditioner and comparisons to state-of-the-art load balancers show the benefits of the construction.
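
The following Python sketch shows a first-order diffusion iteration on the partition graph, the generic building block behind the "different versions of the diffusion algorithm" mentioned above. It is not the paper's implementation; the breadth-first-search variant and the shape-aware selection of elements are omitted.

import numpy as np

def diffusion_flow(adj, load, alpha=0.25, iters=200):
    """First-order diffusion load balancing on a processor/partition graph.

    adj:  symmetric adjacency lists of the partition graph
    load: current number of elements per partition
    Returns (balanced loads, flow), where flow[i, j] > 0 means partition i
    should migrate roughly that many elements to neighbour j.
    """
    n = len(adj)
    load = np.asarray(load, dtype=float)
    flow = np.zeros((n, n))
    for _ in range(iters):
        new = load.copy()
        for i in range(n):
            for j in adj[i]:
                delta = alpha * (load[i] - load[j])   # exchange with neighbour
                new[i] -= delta
                flow[i, j] += delta
        load = new
    return load, flow


# Path graph of 4 partitions with unbalanced element counts
adj = [[1], [0, 2], [1, 3], [2]]
loads, flow = diffusion_flow(adj, [100, 20, 20, 20])
print(np.round(loads, 1))                  # converges towards 40 elements each
print(np.round([flow[0, 1], flow[1, 2], flow[2, 3]], 1))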

Relevance:

10.00%

Publisher:

Abstract:

A comprehensive simulation of solidification/melting processes requires the simultaneous representation of free-surface fluid flow, heat transfer, phase change, non-linear solid mechanics, and, possibly, electromagnetics, together with their interactions, in what is now referred to as "multi-physics" simulation. A 3D computational procedure and software tool, PHYSICA, embedding the above multi-physics models using finite volume methods on unstructured meshes (FV-UM), has been developed. Multi-physics simulations are extremely compute-intensive, and a strategy to parallelise such codes has therefore been developed. This strategy has been applied to PHYSICA and evaluated on a range of challenging multi-physics problems drawn from actual industrial cases.