922 results for Voice Digital Processing


Relevance: 30.00%

Abstract:

This dissertation explores the transformation of opéra comique (as represented by the opera Carmen) and the impact of the verismo style (as represented by the opera La Bohème) upon the development of operetta and American musical theater, and the resultant change in vocal style. Late nineteenth-century operetta called for a classically trained soprano voice with a clear vibrato. High tessitura and legato were expected, although the quality of the voice was usually lighter in timbre. The dissertation comprises four programs that explore the transformation of vocal and compositional style into the current vocal performance practice of American musical theater. The first two programs are operatic roles and the last two are recital presentations of nineteenth- and twentieth-century operetta and musical theater repertoire. Program one, Carmen, was presented on July 26, 2007 at the Marshall Performing Arts Center in Duluth, MN, where I sang the role of Micaela. Program two, La Bohème, was presented on May 24, 2008 at Randolph Road Theater in Silver Spring, MD, where I sang the role of Musetta. Program three, presented on December 2, 2008, and program four, presented on May 10, 2009, were two recitals featuring operetta and musical theater repertoire. These programs were heard in the Gildenhorn Recital Hall at the Clarice Smith Performing Arts Center in College Park, MD. Programs one and two are documented in digital video format available on digital video disc. Programs three and four are documented in digital audio format available on compact disc. All programs are accompanied by program notes, also available in digital format.

Relevance: 30.00%

Abstract:

Using scientific methods in the humanities is at the forefront of objective literary analysis. However, processing big data is particularly complex when the subject matter is qualitative rather than numerical. Large volumes of text require specialized tools to produce quantifiable data from ideas and sentiments. Our team researched the extent to which tools such as Weka and MALLET can test hypotheses about qualitative information. We examined the claim that literary commentary exists within political environments, using US periodical articles concerning Russian literature in the early twentieth century as a case study. These tools generated useful quantitative data that allowed us to run stepwise binary logistic regressions. These statistical tests allowed for time-series experiments using sea change and emergency models of history, as well as classification experiments with regard to author characteristics, social issues, and sentiment expressed. Both types of experiments supported our claim to varying degrees, but more importantly served as a definitive demonstration that digitally enhanced quantitative forms of analysis can apply to qualitative data. Our findings set the foundation for further experiments in the emerging field of digital humanities.
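A minimal sketch of the kind of pipeline this describes, under assumptions: per-article topic proportions (as a tool such as MALLET might output) are used as predictors of a binary label in a logistic regression. The study used stepwise binary logistic regressions and real periodical data; the sketch below uses plain (non-stepwise) scikit-learn logistic regression on synthetic stand-in data, and the variable names and label are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-in data: 200 articles, each described by 10 topic proportions
# (as MALLET might produce) and a hypothetical binary label, e.g.
# "published before vs after a chosen political watershed".
X = rng.dirichlet(np.ones(10), size=200)
y = rng.integers(0, 2, size=200)

model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.coef_)   # which topics push an article toward one class or the other
```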

Relevance: 30.00%

Abstract:

The Student Experience of e-Learning Laboratory (SEEL) project at the University of Greenwich was designed to explore and then implement a number of approaches to investigating learners' experiences of using technology to support their learning. In this paper members of the SEEL team present initial findings from a university-wide survey of nearly 1,000 students. A selection of 90 'cameos', drawn from the survey data, offers further insights into personal perceptions of e-learning and illustrates the diversity of students' experiences. The cameos provide a more coherent picture of individual student experience based on the totality of each person's responses to the questionnaire. Finally, extracts from follow-up case studies, based on interviews with a small number of students, allow us to 'hear' the student voice more clearly. Issues arising from an analysis of the data include student preferences for communication and social networking tools, views on the 'smartness' of their tutors' uses of technology and perceptions of the value of e-learning. A primary finding, and the focus of this paper, is that students effectively arrive at their own individualised selection, configuration and use of technologies and software to meet their perceived needs. This 'personalisation' does not imply that such configurations are the most efficient, nor does it automatically suggest that effective learning is occurring. SEEL reminds us that learners are individuals who approach learning both with and without technology in their own distinctive ways. Hearing, understanding and responding to the student voice is fundamental to maximising learning effectiveness. Institutions should consider actively developing the capacity of academic staff to advise students on the usefulness of particular online tools and resources in support of learning, and consider the potential benefits of building on what students already use in their everyday lives. Given the widespread perception that students tend to be 'digital natives' and academic staff 'digital immigrants' (Prensky, 2001), this could represent a considerable cultural challenge.

Relevance: 30.00%

Abstract:

A novel application-specific instruction set processor (ASIP) for use in the construction of modern signal processing systems is presented. This is a flexible device that can be used in the construction of array processor systems for the real-time implementation of functions such as singular-value decomposition (SVD) and QR decomposition (QRD), as well as other important matrix computations. It uses a coordinate rotation digital computer (CORDIC) module to perform arithmetic operations, and several approaches are adopted to achieve high performance, including pipelining of the micro-rotations, the use of parallel instructions and a dual-bus architecture. In addition, a novel method for scale factor correction is presented which needs to be applied only once, at the end of the computation. This also reduces computation time and enhances performance. Methods are described which allow this processor to be used in reduced-dimension (i.e., folded) array processor structures, allowing tradeoffs between hardware and performance. The net result is a flexible matrix computational processing element (PE) whose functionality can be changed under program control for use in a wider range of scenarios than previous work. Details are presented of the results of a design study, which considers the application of this decomposition PE architecture in a combined SVD/QRD system and demonstrates that a combination of high performance and efficient silicon implementation is achievable. © 2005 IEEE.
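To make the deferred scale-factor correction concrete, here is a minimal software sketch of CORDIC rotation: the micro-rotations are applied first and the accumulated CORDIC gain is removed once, after the final iteration, rather than per iteration. This is only a floating-point illustration of the arithmetic, not the paper's pipelined hardware design, and the iteration count is an arbitrary choice.

```python
import math

def cordic_rotate(x, y, theta, iterations=16):
    # Rotate (x, y) by theta using CORDIC micro-rotations.
    z = theta
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0
        x, y = x - d * y * 2.0**-i, y + d * x * 2.0**-i
        z -= d * math.atan(2.0**-i)
    # Deferred scale-factor correction: the accumulated CORDIC gain is
    # removed once, after all micro-rotations, not inside the loop.
    k = 1.0
    for i in range(iterations):
        k /= math.sqrt(1.0 + 2.0**(-2 * i))
    return x * k, y * k

# Example: a Givens-style rotation of the kind used in QRD/SVD arrays.
print(cordic_rotate(1.0, 0.0, math.pi / 3))   # ~ (0.5, 0.866)
```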

Relevance: 30.00%

Abstract:

One of the attractive features of sound synthesis by physical modeling is the potential to build acoustic-sounding digital instruments that offer more flexibility and different options in their design and control than their real-life counterparts. In order to develop such virtual-acoustic instruments, the models they are based on need to be fully parametric, i.e., all coefficients employed in the model are functions of physical parameters that are controlled either online or at the (offline) design stage. In this letter we show how propagation losses can be parametrically incorporated in digital waveguide string models with the use of zero-phase FIR filters. Starting from the simplest possible design in the form of a three-tap FIR filter, a higher-order FIR strategy is presented and discussed within the perspective of string sound synthesis with digital waveguide models.
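As a rough illustration of the simplest case, the sketch below reduces a waveguide string to a single feedback loop with a symmetric three-tap loss filter. The filter is symmetric about the loop delay, so it contributes no phase (it does not detune the string), which is the zero-phase property discussed above. The coefficient values and loop structure are illustrative assumptions, not the letter's actual parameterisation.

```python
import numpy as np

def plucked_string(f0, fs=44100, dur=1.0, a=0.05, g=0.995):
    # Single-loop string model: y[n] = g*(a*y[n-N-1] + (1-2a)*y[n-N] + a*y[n-N+1]).
    # The three-tap loss filter is symmetric about the delay N, so it is
    # zero-phase with respect to the loop; a and g are illustrative values.
    N = int(round(fs / f0))                 # loop delay in samples
    n_out = int(dur * fs)
    y = np.zeros(n_out + N + 1)
    y[:N + 1] = np.random.default_rng(0).uniform(-1, 1, N + 1)   # noise excitation
    for n in range(N + 1, len(y)):
        y[n] = g * (a * y[n - N - 1] + (1 - 2 * a) * y[n - N] + a * y[n - N + 1])
    return y[-n_out:]

tone = plucked_string(220.0)   # roughly an A3 string
```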

Relevance: 30.00%

Abstract:

BACKGROUND:
Tissue microarrays (TMAs) are a valuable platform for tissue-based translational research and the discovery of tissue biomarkers. The digitised TMA slides, or TMA virtual slides, are ultra-large digital images and can contain several hundred samples. The processing of such slides is time-consuming, bottlenecking a potentially high-throughput platform.
METHODS:
A High Performance Computing (HPC) platform for the rapid analysis of TMA virtual slides is presented in this study. Using an HP high-performance cluster and a centralised dynamic load-balancing approach, the simultaneous analysis of multiple tissue cores was established. This was evaluated on non-small cell lung cancer TMAs for complex analysis of tissue pattern and immunohistochemical positivity.
RESULTS:
The automated processing of a single TMA virtual slide containing 230 patient samples can be speeded up significantly, by a factor of approximately 22, bringing the analysis time down to one minute. Over 90 TMAs could also be analysed simultaneously, greatly accelerating multiplex biomarker experiments.
CONCLUSIONS:
The methodologies developed in this paper provide, for the first time, a genuine high-throughput analysis platform for TMA biomarker discovery that will significantly enhance the reliability and speed of biomarker research. This will have widespread implications in translational tissue-based research.
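A toy sketch of the centralised dynamic load-balancing idea, scaled down to a single machine: tissue cores sit in one shared work list and each worker pulls the next core as soon as it finishes, so faster workers automatically process more cores. The paper's platform ran on an HP HPC cluster with its own image-analysis code; here the per-core analysis is a placeholder, and the worker count of 22 is an arbitrary choice (22 is the reported speedup, not necessarily the node count).

```python
from multiprocessing import Pool
import os

def analyse_core(core_id):
    # Placeholder for the per-core analysis (tissue-pattern segmentation and
    # immunohistochemical positivity scoring); the real image analysis from
    # the study is not reproduced here.
    return core_id, os.getpid()

if __name__ == "__main__":
    core_ids = range(230)                        # one virtual slide, 230 patient cores
    with Pool(processes=22) as pool:
        # imap_unordered behaves like a centralised dynamic load balancer:
        # each worker requests the next core as soon as it becomes free.
        for core_id, worker_pid in pool.imap_unordered(analyse_core, core_ids):
            print(f"core {core_id} analysed by worker {worker_pid}")
```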

Relevance: 30.00%

Abstract:

The purpose of this study was to investigate the occupational hazards within the tanning industry caused by contaminated dust. A qualitative assessment of the risk of human exposure to dust was made throughout a commercial Kenyan tannery. Using this information, high-risk points in the processing line were identified and dust sampling regimes developed. An optical set-up using microscopy and digital imaging techniques was used to determine dust particle numbers and size distributions. The results showed that chemical handling was the most hazardous area (12 mg m(-3)). A Monte Carlo method was used to estimate the concentration of dust in the air throughout the tannery during an 8 h working day. This showed that the high-risk area of the tannery was associated with mean dust concentrations greater than the limits stipulated in UK Statutory Instrument 2002 No. 2677, exceeding 10 mg m(-3) (inhalable dust limit) and 4 mg m(-3) (respirable dust limit). This has implications for the provision of personal protective equipment (PPE) to tannery workers for the mitigation of occupational risk.
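A minimal sketch of a Monte Carlo exposure estimate of the kind described: dust concentrations in each zone of the tannery are drawn from an assumed lognormal distribution and combined into an 8-hour time-weighted average, which can then be compared against the regulatory limit. The distribution parameters and time budget below are illustrative assumptions only; the study's actual sampling data and model are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed lognormal dust-concentration models (mg m^-3) for two tannery zones
# and the hours a worker spends in each; values are illustrative only.
zones = {
    "chemical handling": {"median": 12.0, "gsd": 1.8, "hours": 3.0},
    "rest of line":      {"median": 3.0,  "gsd": 1.6, "hours": 5.0},
}

def simulate_8h_twa(n_days=100_000):
    twa = np.zeros(n_days)
    for z in zones.values():
        conc = rng.lognormal(np.log(z["median"]), np.log(z["gsd"]), n_days)
        twa += conc * z["hours"]
    return twa / 8.0        # 8-hour time-weighted average

twa = simulate_8h_twa()
print("mean 8-h TWA (mg m^-3):", twa.mean())
print("fraction of days exceeding the 10 mg m^-3 inhalable limit:", (twa > 10.0).mean())
```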

Relevance: 30.00%

Abstract:

The highly structured nature of many digital signal processing operations allows them to be implemented directly as regular VLSI circuits. This feature has been successfully exploited in the design of a number of commercial chips, some examples of which are described. While many of the architectures on which such chips are based were originally derived on a heuristic basis, there is increasing interest in the development of systematic design techniques for the direct mapping of computations onto regular VLSI arrays. The purpose of this paper is to show how the technique proposed by Kung can be readily extended to the design of VLSI signal processing chips where the organisation of computations at the level of individual data bits is of paramount importance. The technique in question allows architectures to be derived using the projection and retiming of data dependence graphs.
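The projection-and-scheduling step can be illustrated with a small numerical sketch. Here the dependence-graph nodes (n, k) of an FIR computation y(n) = sum_k h(k) x(n-k) are mapped onto a linear systolic array: a projection direction d collapses the graph onto processors and a schedule vector s assigns time steps, with validity requiring s . d != 0 so that nodes sharing a processor never share a time step. The vectors chosen are one textbook choice, not necessarily those used in the paper, and the bit-level refinement the paper addresses is not shown.

```python
import numpy as np

s = np.array([1, 1])   # schedule vector: time = n + k
d = np.array([1, 0])   # projection direction (collapse along n)
P = np.array([0, 1])   # processor allocation: PE index = k

assert s @ d != 0      # nodes on one PE must get distinct time steps

for n in range(4):             # a few input samples
    for k in range(3):         # a 3-tap filter
        node = np.array([n, k])
        print(f"node (n={n}, k={k}) -> PE {P @ node}, time {s @ node}")
```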

Relevance: 30.00%

Abstract:

The application of fine-grain pipelining techniques in the design of high-performance wave digital filters (WDFs) is described. It is shown that significant increases in the sampling rate of bit-parallel circuits can be achieved using most-significant-bit (msb) first arithmetic. A novel VLSI architecture for implementing two-port adaptor circuits, which embodies these ideas, is described. The circuit in question is highly regular, uses msb-first arithmetic and is implemented using simple carry-save adders. © 1992 Kluwer Academic Publishers.
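For readers unfamiliar with carry-save arithmetic, the sketch below shows the basic reduction step: three operands are compressed into a (sum, carry) pair with no carry propagation across the word, which is what keeps the adaptor's critical path short. This is a word-level software model only, not the bit-parallel msb-first hardware described in the paper.

```python
def carry_save_add(a, b, c):
    # One carry-save step: per-bit sum and per-bit majority carry,
    # with the carry shifted left but never propagated along the word.
    s = a ^ b ^ c
    carry = ((a & b) | (a & c) | (b & c)) << 1
    return s, carry

s, carry = carry_save_add(0b1011, 0b0110, 0b1101)
assert s + carry == 0b1011 + 0b0110 + 0b1101   # one final conventional add resolves the pair
```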

Relevance: 30.00%

Abstract:

A number of high-performance VLSI architectures for real-time image coding applications are described. In particular, attention is focused on circuits for computing the 2-D DCT (discrete cosine transform) and for 2-D vector quantization. The former circuits are based on Winograd algorithms and comprise a number of bit-level systolic arrays with a bit-serial, word-parallel input. The latter circuits exhibit a similar data organization and consist of a number of inner-product array circuits. Both sets of circuits are highly regular and allow extremely high data rates to be achieved through extensive use of parallelism.
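The inner-product formulation of vector quantization can be seen in a few lines of code: the nearest-codeword search argmin ||x - c||^2 is equivalent to argmax (x . c - ||c||^2 / 2), so the encoder reduces to dot products, which is the operation the inner-product arrays compute. The sketch below is a software illustration with random stand-in data, not the paper's circuit.

```python
import numpy as np

def vq_encode(blocks, codebook):
    # Nearest-codeword search written entirely as inner products.
    scores = blocks @ codebook.T - 0.5 * np.sum(codebook ** 2, axis=1)
    return np.argmax(scores, axis=1)

rng = np.random.default_rng(0)
codebook = rng.normal(size=(256, 16))   # 256 codewords for 4x4 image blocks
blocks = rng.normal(size=(1000, 16))    # flattened 4x4 blocks to encode
indices = vq_encode(blocks, codebook)
```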

Relevance: 30.00%

Abstract:

The application of fine-grain pipelining techniques in the design of high-performance wave digital filters (WDFs) is described. The latency in feedback loops can be significantly reduced if computations are organised most-significant-bit first, as opposed to least-significant-bit first, and if the results are fed back as soon as they are formed. The result is that chips can be designed which offer significantly higher sampling rates than can otherwise be obtained using conventional methods. How these concepts can be extended to the more challenging problem of WDFs is discussed. It is shown that significant increases in the sampling rate of bit-parallel circuits can be achieved using most-significant-bit-first arithmetic.
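A back-of-the-envelope model of why feedback-loop latency limits the sampling rate, using assumed latency figures rather than anything from the paper: with lsb-first ripple-carry arithmetic the full word must be produced before the result can be fed back, whereas msb-first redundant arithmetic can feed results back after a small fixed number of digit delays.

```python
# Assumed figures: adders in the critical loop, per-bit delay, wordlength,
# and the number of digit delays before an msb-first result can be reused.
def max_sampling_rate_mhz(adders_in_loop, bit_delay_ns, wordlength, online_delay=None):
    bits_before_feedback = wordlength if online_delay is None else online_delay
    loop_latency_ns = adders_in_loop * bits_before_feedback * bit_delay_ns
    return 1e3 / loop_latency_ns

print(max_sampling_rate_mhz(3, 1.0, 16))                  # lsb-first:  ~20.8 MHz
print(max_sampling_rate_mhz(3, 1.0, 16, online_delay=4))  # msb-first: ~83.3 MHz
```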