993 results for Digital Presentations
Abstract:
The potential of the violoncello as a solo instrument was recognized and supported by cellists such as Luigi Boccherini (1743-1805), Jean-Louis Duport (1749-1819), Auguste Franchomme (1808-1884), and Alfredo Piatti (1822-1901). These pioneers composed technically demanding etudes, exercises, and caprices for the cello that were comparable to those already present in the violin literature. Even so, in the late nineteenth and early twentieth centuries, considerably fewer substantial works were written for the cello than for the violin. Consequently, many cellists, such as Luigi Silva (1903-1961), Gregor Piatigorsky (1903-1976), Pierre Fournier (1906-1986), and Janos Starker (b. 1924), selected notable pieces from the violin repertoire and transcribed them for the cello. Some composers themselves adapted for the cello their own works originally written for the violin: Johannes Brahms with his Violin Sonata Op. 78, Igor Stravinsky with his Suite Italienne, and Béla Bartók with his First Rhapsody all belong to this category. Such adaptations further raised awareness among composers and performers of the possibilities of the cello as an independent and expressive instrument, and many composers from the early 1900s to the present were thus encouraged to write increasing numbers of more soloistic and demanding works for the cello. Herein, I explore the repertoire of cello transcriptions in order to analyze the differences between the original and transcribed versions and the challenges found therein. The performer may attempt to recreate the effect originally intended for the violin or, more daringly, may search for alternative presentations of the music better suited to, and more expressive of, the cello's own character. The project includes two recitals of the following transcribed works, presented at the University of Maryland, College Park, School of Music: Sonata in A by César Franck, transcribed by Jules Delsart; Variations on a Theme from Rossini by Niccolò Paganini, transcribed by Fournier; Suite Italienne by Igor Stravinsky, transcribed with the help of Piatigorsky; Sonatina Op. 137, No. 1 by Franz Schubert, transcribed by Starker; First Rhapsody by Béla Bartók; and Sonata, Op. 108 by Johannes Brahms, transcribed by Hsiao-mei Sun.
Abstract:
Droplet-based digital microfluidics technology has now come of age, and software-controlled biochips for healthcare applications are starting to emerge. However, today's digital microfluidic biochips suffer from the drawback that there is no feedback to the control software from the underlying hardware platform. Due to the lack of precision inherent in biochemical experiments, errors are likely during droplet manipulation; error recovery based on the repetition of experiments leads to wastage of expensive reagents and hard-to-prepare samples. By exploiting recent advances in the integration of optical detectors (sensors) into a digital microfluidic biochip, we present a physical-aware system reconfiguration technique that uses sensor data at intermediate checkpoints to dynamically reconfigure the biochip. A cyberphysical resynthesis technique is used to recompute electrode-actuation sequences, thereby deriving new schedules, module placement, and droplet routing pathways, with minimum impact on the time-to-response. © 2012 IEEE.
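The checkpoint-driven recovery scheme summarized above can be pictured as a small control loop. The Python sketch below is a purely hypothetical illustration, not the authors' implementation: the step list, the read_optical_sensor and resynthesize callables, and the volume tolerance are invented placeholders standing in for the real electrode-actuation sequences, on-chip detectors, and resynthesis engine.

# Toy sketch of checkpoint-based error recovery on a digital microfluidic biochip.
# Every name here (Step, read_optical_sensor, resynthesize) is a hypothetical placeholder.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Step:
    name: str
    is_checkpoint: bool = False    # an optical detector is available at this step
    expected_volume: float = 1.0   # nominal droplet volume (arbitrary units)
    tolerance: float = 0.1         # acceptable relative deviation

def run_assay(steps: List[Step],
              actuate: Callable[[Step], None],
              read_optical_sensor: Callable[[Step], float],
              resynthesize: Callable[[List[Step]], List[Step]]) -> None:
    """Execute steps in order; at each checkpoint compare the sensed droplet
    volume with its expectation and, on error, recompute only the remaining
    schedule instead of repeating the whole experiment."""
    i = 0
    while i < len(steps):
        step = steps[i]
        actuate(step)  # drive the electrode-actuation sequence for this step
        if step.is_checkpoint:
            measured = read_optical_sensor(step)
            if abs(measured - step.expected_volume) / step.expected_volume > step.tolerance:
                # Dynamic reconfiguration: derive a new schedule, module placement,
                # and routing for the remaining operations from the current chip state.
                steps = steps[:i + 1] + resynthesize(steps[i + 1:])
        i += 1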
Abstract:
The advent of digital microfluidic lab-on-a-chip (LoC) technology offers a platform for developing diagnostic applications with the advantages of portability, reduction of the volumes of the sample and reagents, faster analysis times, increased automation, low power consumption, compatibility with mass manufacturing, and high throughput. Moreover, digital microfluidics is being applied in other areas such as airborne chemical detection, DNA sequencing by synthesis, and tissue engineering. In most diagnostic and chemical-detection applications, a key challenge is the preparation of the analyte for presentation to the on-chip detection system. Thus, in diagnostics, raw physiological samples must be introduced onto the chip and then further processed by lysing blood cells and extracting DNA. For massively parallel DNA sequencing, sample preparation can be performed off chip, but the synthesis steps must be performed in a sequential on-chip format by automated control of buffers and nucleotides to extend the read lengths of DNA fragments. In airborne particulate-sampling applications, the sample collection from an air stream must be integrated into the LoC analytical component, which requires a collection droplet to scan an exposed impacted surface after its introduction into a closed analytical section. Finally, in tissue-engineering applications, the challenge for LoC technology is to build high-resolution (less than 10 microns) 3D tissue constructs with embedded cells and growth factors by manipulating and maintaining live cells in the chip platform. This article discusses these applications and their implementation in digital-microfluidic LoC platforms. © 2007 IEEE.
Abstract:
This dissertation addresses the growing need to entice people to attend a classical solo vocal recital by incorporating thematic programming, multi-media presentations, collaborations, and innovative marketing. It comprises four programs that use these tactics, creating live performances of classical vocal music that appeal to the attention-deficient 21st-century audience. Each program focuses on repertoire appropriate for the male alto voice and includes elements of spoken word, visual imagery, and/or movement through collaborations with actors, singers, dancers, designers, and visual artists. Program one (March 1, 2004), La Voix Humaine: The Life of an Englishwoman in Music, Poetry, & Art, outlines the life of a fictitious Englishwoman through a self-composed narration, spoken by an actress, a PowerPoint presentation of visual art by 20th-century English artists, and musical commentary provided by the collaboration of a vocalist and a pianist. Program two (October 15, 2004), La Voix Thématique: Anima - Music that Moves, is a program of pieces ranging from the 14th to the 20th centuries, half of which are choreographed by members of the University of Maryland Dance Department. Program three is a lecture recital entitled L'Haute Voix: Identifying the High Male Voice and Appropriate Repertoire, presented in collaboration with three singers, a pianist, a harpsichordist, and a cellist. Program four, La Voix Dramatique: Opera Roles for the Countertenor Voice, comprises performances of George Frideric Handel's Giulio Cesare in Egitto (1724) in collaboration with the Maryland Opera Studio and the Clarice Smith Performing Arts Center (Leon Major, director; Kenneth Merrill, conductor). There are two performances each of the title role, Cesare (April 15 & 17, 2005), and his nemesis, Tolomeo (April 21 & 23, 2005). All programs are documented in a digital audio format available on compact disc and are accompanied by program notes also available in digital format. Programs two and four are also documented in digital video format available on digital video disc.
Abstract:
The ability to manipulate small fluid droplets, colloidal particles and single cells with the precision and parallelization of modern-day computer hardware has profound applications for biochemical detection, gene sequencing, chemical synthesis and highly parallel analysis of single cells. Drawing inspiration from general circuit theory and magnetic bubble technology, here we demonstrate a class of integrated circuits for executing sequential and parallel, timed operations on an ensemble of single particles and cells. The integrated circuits are constructed from lithographically defined, overlaid patterns of magnetic film and current lines. The magnetic patterns passively control particles similar to electrical conductors, diodes and capacitors. The current lines actively switch particles between different tracks similar to gated electrical transistors. When combined into arrays and driven by a rotating magnetic field clock, these integrated circuits have general multiplexing properties and enable the precise control of magnetizable objects.
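As a rough analogy for the clocked transport and gated switching described above, the following Python toy model treats tracks and junctions abstractly: one "clock cycle" (one rotation of the external field) advances every particle by one site, and an enabled gate diverts a particle from one track to the other, much as a gated transistor steers current. The classes and the two-track layout are invented for illustration and do not model the actual magnetic-film and current-line hardware.

# Conceptual toy model only; not a simulation of the physical device.
class Gate:
    """A transistor-like junction: when enabled, a particle arriving at its
    position on track A is diverted onto track B on the next clock cycle."""
    def __init__(self, position, enabled=False):
        self.position = position
        self.enabled = enabled

def clock_cycle(particles, gate):
    """One rotation of the driving field: every particle advances one site;
    a particle sitting at an enabled gate switches tracks."""
    advanced = []
    for track, pos in particles:
        if track == "A" and pos == gate.position and gate.enabled:
            track = "B"                     # active switching (gated transistor analogue)
        advanced.append((track, pos + 1))   # passive transport along the magnetic pattern
    return advanced

particles = [("A", 0), ("A", 2)]
gate = Gate(position=2, enabled=True)
print(clock_cycle(particles, gate))         # [('A', 1), ('B', 3)]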
Abstract:
This dissertation explores the transformation of opéra comique (as represented by the opera Carmen) and the impact of verismo style (as represented by the opera La Bohème) on the development of operetta and American musical theater, and the resultant change in vocal style. Late nineteenth-century operetta called for a classically trained soprano voice with a clear vibrato. High tessitura and legato were expected, although the quality of the voice was usually lighter in timbre. The dissertation comprises four programs that explore the transformation of vocal and compositional style into the current vocal performance practice of American musical theater. The first two programs are operatic roles and the last two are recital presentations of nineteenth- and twentieth-century operetta and musical theater repertoire. Program one, Carmen, was presented on July 26, 2007 at the Marshall Performing Arts Center in Duluth, MN, where I sang the role of Micaëla. Program two, La Bohème, was presented on May 24, 2008 at Randolph Road Theater in Silver Spring, MD, where I sang the role of Musetta. Program three, presented on December 2, 2008, and program four, presented on May 10, 2009, were two recitals featuring operetta and musical theater repertoire. These programs were heard in the Gildenhorn Recital Hall at the Clarice Smith Performing Arts Center in College Park, MD. Programs one and two are documented in a digital video format available on digital video disc. Programs three and four are documented in a digital audio format available on compact disc. All programs are accompanied by program notes also available in digital format.
Abstract:
CT and digital subtraction angiography (DSA) are ubiquitous in the clinic. Their preclinical equivalents are valuable imaging methods for studying disease models and treatment. We have developed a dual source/detector X-ray imaging system that we have used for both micro-CT and DSA studies in rodents. The control of such a complex imaging system requires substantial software development for which we use the graphical language LabVIEW (National Instruments, Austin, TX, USA). This paper focuses on a LabVIEW platform that we have developed to enable anatomical and functional imaging with micro-CT and DSA. Our LabVIEW applications integrate and control all the elements of our system including a dual source/detector X-ray system, a mechanical ventilator, a physiological monitor, and a power microinjector for the vascular delivery of X-ray contrast agents. Various applications allow cardiac- and respiratory-gated acquisitions for both DSA and micro-CT studies. Our results illustrate the application of DSA for cardiopulmonary studies and vascular imaging of the liver and coronary arteries. We also show how DSA can be used for functional imaging of the kidney. Finally, the power of 4D micro-CT imaging using both prospective and retrospective gating is shown for cardiac imaging.
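One of the acquisition modes mentioned above, retrospective gating, can be sketched in a few lines: projections acquired continuously are sorted into cardiac-phase bins after the scan, using the recorded R-peak times, and each bin is then reconstructed into one frame of the 4D data set. The Python fragment below is a minimal illustration of that binning step with invented inputs; it is not part of the LabVIEW platform described in the article.

# Minimal sketch of retrospective cardiac gating (illustrative inputs only).
import numpy as np

def retrospective_gate(projection_times, r_peak_times, n_bins=10):
    """Assign each projection index to a cardiac-phase bin in [0, n_bins)."""
    r_peaks = np.asarray(r_peak_times)
    bins = {b: [] for b in range(n_bins)}
    for idx, t in enumerate(projection_times):
        k = np.searchsorted(r_peaks, t, side="right") - 1
        if k < 0 or k + 1 >= len(r_peaks):
            continue                        # projection falls outside the recorded cycles
        phase = (t - r_peaks[k]) / (r_peaks[k + 1] - r_peaks[k])
        bins[int(phase * n_bins)].append(idx)
    return bins                             # each bin reconstructs into one 4D frame

# Example: a 1 Hz heartbeat with a projection every 90 ms.
beats = np.arange(0.0, 10.0, 1.0)
times = np.arange(0.0, 9.5, 0.09)
gated = retrospective_gate(times, beats)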
Abstract:
Using scientific methods in the humanities is at the forefront of objective literary analysis. However, processing big data is particularly complex when the subject matter is qualitative rather than numerical. Large volumes of text require specialized tools to produce quantifiable data from ideas and sentiments. Our team researched the extent to which tools such as Weka and MALLET can test hypotheses about qualitative information. We examined the claim that literary commentary exists within political environments and used US periodical articles concerning Russian literature in the early twentieth century as a case study. These tools generated useful quantitative data that allowed us to run stepwise binary logistic regressions. These statistical tests allowed for time series experiments using sea change and emergency models of history, as well as classification experiments with regard to author characteristics, social issues, and sentiment expressed. Both types of experiments supported our claim to varying degrees but, more importantly, served as a definitive demonstration that digitally enhanced quantitative forms of analysis can apply to qualitative data. Our findings set the foundation for further experiments in the emerging field of digital humanities.
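For readers unfamiliar with the workflow, the core move of turning text into quantifiable features and fitting a binary logistic regression can be shown in a few lines. The sketch below uses scikit-learn rather than the Weka and MALLET tools used in the project, omits the stepwise variable selection, and relies on an invented four-document corpus with made-up labels; it illustrates the general technique, not the study's actual pipeline or data.

# Illustrative only: tiny invented corpus, simplified (non-stepwise) logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

articles = [
    "Tolstoy's moral vision transcends politics",
    "Dostoevsky read as a warning against revolution",
    "Soviet writers praised for proletarian realism",
    "Gorky celebrated as the voice of the new state",
]
labels = [0, 0, 1, 1]   # hypothetical classes, e.g. two periods of commentary

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(articles, labels)
print(model.predict(["A revolutionary reading of Chekhov"]))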
Abstract:
High school students are confronted with the use and interpretation of parameters in polynomial functions, geometric loci, and algebraic expressions in general. This leads to the need not only to distinguish parameters from other kinds of literal symbols, such as variables or unknowns, but also to give them a sense of use, with the aim of grouping mathematical objects into more general entities such as families of functions. The objective of this workshop is to show the influence that the use of a dynamic technological resource can have on the understanding of this polysemy of literal symbols, as well as on the refinement of ideas such as generalization.
Abstract:
This study is an action research project in the field of Informatics in Mathematics Education on learning to learn cooperatively, according to Piaget's Sociological Studies, within a digital learning space for Mathematics. It was carried out at IFRS – Osório in 2011 and 2012 with 60 students of a technical secondary program in informatics. The central question is how to analyze and understand the process of cooperative learning of mathematical concepts in this space; the definition of this space and of cooperative learning is itself a result of the research. In addition, the study documents the construction of mathematical concepts and the students' engagement in learning as online digital technologies are incorporated into Mathematics classes, under the autonomy and responsibility of each student and/or their group.
Abstract:
The theory of meaningful mathematical instruction based on the onto-semiotic model of mathematical cognition known as the Theory of Semiotic Functions (TFS) provides a unified framework for studying the various forms of mathematical knowledge and their respective interactions within didactic systems (Godino, 1998). We present a development of this theory consisting of the decomposition of an object (in our model, Continuity) into units in order to identify the entities and semiotic functions that are established during the teaching and learning process in a school institution when a digital technology environment (TI-92 Plus graphing calculator and/or Voyage 200) is implemented.
Abstract:
In this short course we use various dynamic geometry applications to carry out geometric constructions in the Poincaré model of hyperbolic geometry, with the aim of investigating and determining the nature of selected geometry theorems for secondary and higher education. On this basis we classify some theorems of plane geometry as neutral, strictly Euclidean, or strictly hyperbolic.
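A small numeric check illustrates the kind of classification the course aims at: the Euclidean theorem that the angles of a triangle sum to 180 degrees fails in the hyperbolic plane, so it is strictly Euclidean rather than neutral. The Python fragment below uses the hyperbolic law of cosines with arbitrary side lengths; it is an independent illustration, not taken from any of the dynamic geometry applications used in the course.

# Angle sum of a hyperbolic triangle is strictly less than 180 degrees.
import math

def hyperbolic_angle(a, b, c):
    """Angle opposite side a in a hyperbolic triangle with side lengths a, b, c."""
    num = math.cosh(b) * math.cosh(c) - math.cosh(a)
    return math.acos(num / (math.sinh(b) * math.sinh(c)))

a, b, c = 1.0, 1.0, 1.0   # an equilateral hyperbolic triangle (arbitrary size)
angle_sum = sum(math.degrees(hyperbolic_angle(x, y, z))
                for x, y, z in [(a, b, c), (b, c, a), (c, a, b)])
print(angle_sum)           # roughly 158 degrees, strictly less than 180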