6 results for Syntactic And Semantic Comprehension Tasks

in DRUM (Digital Repository at the University of Maryland)


Relevance: 100.00%

Abstract:

Over a period of 50 years—between 1962 and 2012—three preeminent American piano competitions, the Van Cliburn International Piano Competition, the University of Maryland International Piano Competition/William Kapell International Piano Competition and the San Antonio International Piano Competition, commissioned 26 piano works, almost all by American composers, for inclusion on their required performance lists. These compositions, works of sufficient artistic depth and technical sophistication to serve as rigorous benchmarks for competition finalists, constitute a unique segment of the contemporary American piano repertoire. Although a limited number of these pieces have found their way into the performance repertoire of concert artists, too many have not been performed since their premières in the final rounds of the competitions for which they were designed. Such should not be the case. Some of the composers in question are innovative titans of 20th-century American music—Samuel Barber, Aaron Copland, Leonard Bernstein, John Cage, John Corigliano, William Schuman, Joan Tower and Ned Rorem, to name just a few—and many of the pieces themselves, as historical touchstones, deserve careful examination. This study includes, in addition to an introductory overview of the three competitions, a survey of all 26 compositions and an analysis of their expressive characteristics from the point of view of the performing pianist. Numerous musical examples support the analysis. Biographical information about the composers, along with descriptions of their overall musical styles, places these pieces in historical context. Analytical and technical comprehension of this distinctive and rarely performed corner of the modern classical piano world could be of inestimable value to professional pianists, piano pedagogues and music educators alike.

Relevance: 100.00%

Abstract:

The relevance of explicit instruction has been well documented in SLA research. Despite numerous positive findings, however, the issue continues to engage scholars worldwide. One issue that was largely neglected in previous empirical studies - and one that may be crucial for the effectiveness of explicit instruction - is the timing and integration of rules and practice. The present study investigated the extent to which grammar explanation (GE) before practice, grammar explanation during practice, and individual differences impact the acquisition of L2 declarative and procedural knowledge of two grammatical structures in Spanish. In this experiment, 128 English-speaking learners of Spanish were randomly assigned to four experimental treatments and completed comprehension-based task-essential practice for interpreting object-verb (OV) and ser/estar (SER) sentences in Spanish. Results confirmed the predicted importance of the timing of GE: participants who received GE during practice were more likely to develop and retain their knowledge successfully. Results further revealed that the various combinations of rules and practice posed differential task demands on the learners and consequently drew on language aptitude and working memory (WM) to different extents. Because these correlations between individual differences and learning outcomes were weakest in the conditions that received GE during practice, we argue that suitably integrating rules and practice eased task demands, reducing the burden on the learner and thereby mitigating the role of participants' individual differences. Finally, some evidence also showed that the comprehension practice that participants received for the two structures was not sufficient for the formation of solid productive knowledge, but was more effective for the OV than for the SER construction.

Relevance: 100.00%

Abstract:

Compressed covariance sensing using quadratic samplers has been gaining increasing interest in the recent literature. The covariance matrix often plays the role of a sufficient statistic in many signal and information processing tasks. However, owing to the large dimension of the data, it may become necessary to obtain a compressed sketch of the high dimensional covariance matrix to reduce the associated storage and communication costs. Nested sampling has been proposed in the past as an efficient sub-Nyquist sampling strategy that enables perfect reconstruction of the autocorrelation sequence of Wide-Sense Stationary (WSS) signals, as though they were sampled at the Nyquist rate. The key idea behind nested sampling is to exploit properties of the difference set that naturally arises in the quadratic measurement model associated with covariance compression. In this thesis, we will focus on developing novel versions of nested sampling for low rank Toeplitz covariance estimation and phase retrieval, where the latter problem finds many applications in high resolution optical imaging, X-ray crystallography and molecular imaging. The problem of low rank compressive Toeplitz covariance estimation is first shown to be fundamentally related to that of line spectrum recovery. In the absence of noise, this connection can be exploited to develop a particular kind of sampler called the Generalized Nested Sampler (GNS), which can achieve optimal compression rates. In the presence of bounded noise, we develop a regularization-free algorithm that provably leads to stable recovery of the high dimensional Toeplitz matrix from its order-wise minimal sketch acquired using a GNS. Contrary to existing TV-norm and nuclear norm based reconstruction algorithms, our technique does not use any tuning parameters, which can be of great practical value. Nested sampling also finds a surprising use in the problem of phase retrieval, which has been of great interest in recent times for its convex formulation via PhaseLift. By using another modified version of nested sampling, namely the Partial Nested Fourier Sampler (PNFS), we show that with probability one, it is possible to achieve a certain conjectured lower bound on the necessary measurement size. Moreover, for sparse data, an l1 minimization based algorithm is proposed that can lead to stable phase retrieval using an order-wise minimal number of measurements.
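The difference-set property that makes nested sampling work can be illustrated concretely. The sketch below is a minimal illustration rather than the GNS construction from the thesis: it builds a standard two-level nested sampler (the parameters N1, N2 and the toy AR(1) signal are assumptions chosen for demonstration) and checks that the pairwise differences of only N1 + N2 sample positions cover every lag up to N2(N1+1) - 1, so the full autocorrelation of a WSS signal can be estimated from far fewer physical samples than the Nyquist grid requires.

```python
import numpy as np

# Two-level nested array (in the style of Pal & Vaidyanathan): a dense
# level {1, ..., N1} plus a sparse level {(N1+1), 2(N1+1), ..., N2(N1+1)}.
# Its difference set covers every lag 0 .. N2*(N1+1) - 1, which is what
# lets a quadratic (correlation) sampler recover the full autocorrelation
# of a WSS signal from roughly O(sqrt(N)) physical samples.
def nested_positions(n1, n2):
    dense = np.arange(1, n1 + 1)
    sparse = (n1 + 1) * np.arange(1, n2 + 1)
    return np.concatenate([dense, sparse])

pos = nested_positions(n1=4, n2=5)              # 9 sample positions
diffs = np.unique(np.abs(pos[None, :] - pos[:, None]))
print(diffs)                                    # every lag 0 .. 24

# Toy estimate of the autocorrelation at every lag, using only the
# nested sample positions of an (approximately) WSS AR(1) signal.
rng = np.random.default_rng(0)
L, snapshots, burn = int(pos.max()), 2000, 100
r_hat, counts = np.zeros(L, dtype=complex), np.zeros(L)
for _ in range(snapshots):
    x = np.zeros(burn + L, dtype=complex)
    for t in range(1, burn + L):
        x[t] = 0.9 * x[t - 1] + (rng.standard_normal()
                                 + 1j * rng.standard_normal())
    y = x[burn + pos - 1]                       # sub-Nyquist observations
    for i, pi in enumerate(pos):
        for j, pj in enumerate(pos):
            if pi >= pj:
                r_hat[pi - pj] += y[i] * np.conj(y[j])
                counts[pi - pj] += 1
r_hat /= np.maximum(counts, 1)
print(abs(r_hat[1] / r_hat[0]))                 # close to the AR coefficient 0.9
```

Here nine sample positions recover all 25 lags of a length-25 autocorrelation, the kind of quadratic compression that the GNS generalizes to order-wise optimal rates.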

Relevance: 100.00%

Abstract:

While news stories are an important traditional medium to broadcast and consume news, microblogging has recently emerged as a place where people can discuss, disseminate, collect or report information about news. However, the massive information in the microblogosphere makes it hard for readers to keep up with these real-time updates. This is especially a problem when it comes to breaking news, where people are more eager to know "what is happening". Therefore, this dissertation is intended as an exploratory effort to investigate computational methods to augment human effort when monitoring the development of breaking news on a given topic from a microblog stream by extractively summarizing the updates in a timely manner. More specifically, given an interest in a topic, either entered as a query or presented as an initial news report, a microblog temporal summarization system is proposed to filter microblog posts from a stream with three primary concerns: topical relevance, novelty, and salience. Considering the relatively high arrival rate of microblog streams, a cascade framework consisting of three stages is proposed to progressively reduce the quantity of posts. For each step in the cascade, this dissertation studies methods that improve over current baselines. In the relevance filtering stage, query and document expansion techniques are applied to mitigate sparsity and vocabulary mismatch issues. The use of word embeddings as a basis for filtering is also explored, using unsupervised and supervised modeling to characterize lexical and semantic similarity. In the novelty filtering stage, several statistical ways of characterizing novelty are investigated and ensemble learning techniques are used to integrate results from these diverse techniques. These results are compared with a baseline clustering approach using both standard and delay-discounted measures. In the salience filtering stage, because of the real-time prediction requirement, a method of learning verb phrase usage from past relevant news reports is used in conjunction with some standard measures for characterizing writing quality. Following a Cranfield-like evaluation paradigm, this dissertation includes a series of experiments to evaluate the proposed methods for each step, and for the end-to-end system. New microblog novelty and salience judgments are created, building on existing relevance judgments from the TREC Microblog track. The results point to future research directions at the intersection of social media, computational journalism, information retrieval, automatic summarization, and machine learning.
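To make the cascade concrete, here is a hedged skeleton of the three-stage pipeline. It is an assumed sketch, not the dissertation's implementation: relevance is scored as cosine similarity between a post embedding and the (expanded) query embedding, novelty as dissimilarity to every update already emitted, and salience by a placeholder length score standing in for the verb-phrase and writing-quality features; all thresholds are illustrative.

```python
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9))

class CascadeSummarizer:
    """Illustrative three-stage cascade. Each stage discards posts so
    the later, costlier stages see progressively fewer candidates."""

    def __init__(self, query_vec, rel_t=0.5, nov_t=0.8, sal_t=0.3):
        self.query_vec = query_vec      # embedding of the topic/query
        self.rel_t, self.nov_t, self.sal_t = rel_t, nov_t, sal_t
        self.selected = []              # embeddings of emitted updates

    def relevant(self, post_vec):
        # Stage 1: topical relevance against the (expanded) query.
        return cosine(post_vec, self.query_vec) >= self.rel_t

    def novel(self, post_vec):
        # Stage 2: reject posts too similar to anything already emitted.
        return all(cosine(post_vec, s) < self.nov_t for s in self.selected)

    def salient(self, text):
        # Stage 3: placeholder salience score; the learned verb-phrase
        # and writing-quality measures would replace this.
        return min(len(text.split()) / 20.0, 1.0) >= self.sal_t

    def push(self, text, post_vec):
        """Return True if the incoming post should be emitted as an update."""
        if self.relevant(post_vec) and self.novel(post_vec) \
                and self.salient(text):
            self.selected.append(post_vec)
            return True
        return False
```

Posts are offered one at a time through push() as they arrive, matching the streaming setting; because the cheap relevance test runs first and eliminates most of the stream, the novelty and salience computations only run on a small residue, which is the point of the cascade under a high arrival rate.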

Relevance: 100.00%

Abstract:

The last two decades have seen many exciting examples of tiny robots, from a few cm3 to less than one cm3. Although individually limited, a large group of these robots has the potential to work cooperatively and accomplish complex tasks. Two examples from nature that exhibit this type of cooperation are ant and bee colonies. They have the potential to assist in applications like search and rescue, military scouting, infrastructure and equipment monitoring, nano-manufacture, and possibly medicine. Most of these applications require the high level of autonomy that has been demonstrated by large robotic platforms, such as the iRobot and Honda ASIMO. However, when robot size shrinks, current approaches to achieving the necessary functions are no longer valid. This work focused on challenges associated with the electronics and fabrication. We addressed three major technical hurdles inherent to current approaches: 1) the difficulty of compact integration; 2) the need for real-time and power-efficient computation; 3) the unavailability of commercial tiny actuators and motion mechanisms. The aim of this work was to provide enabling hardware technologies to achieve autonomy in tiny robots. We proposed a decentralized application-specific integrated circuit (ASIC) in which each component is responsible for its own operation and autonomy to the greatest extent possible. The ASIC consists of electronics modules for the fundamental functions required to fulfill the desired autonomy: actuation, control, power supply, and sensing. The actuators and mechanisms could potentially be post-fabricated directly on the ASIC. This design makes for a modular architecture. The following components were shown to work in physical implementations or simulations: 1) a tunable motion controller for ultralow frequency actuation; 2) a nonvolatile memory and programming circuit to achieve automatic and one-time programming; 3) a high-voltage circuit with the highest reported breakdown voltage in standard 0.5 μm CMOS; 4) thermal actuators fabricated using a CMOS compatible process; 5) a low-power mixed-signal computational architecture for a robotic dynamics simulator; 6) a frequency-boost technique to achieve low jitter in ring oscillators. These contributions will be generally enabling for other systems with strict size and power constraints, such as wireless sensor nodes.

Relevance: 100.00%

Abstract:

(Deep) neural networks are increasingly being used for various computer vision and pattern recognition tasks due to their strong ability to learn highly discriminative features. However, quantitative analysis of their classification ability and design philosophies remains nebulous. In this work, we use information theory to analyze concatenated restricted Boltzmann machines (RBMs) and propose a mutual information-based RBM neural network (MI-RBM). We develop a novel pretraining algorithm to maximize the mutual information between RBMs. Extensive experimental results on various classification tasks show the effectiveness of the proposed approach.
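As a concrete illustration, the sketch below greedily pretrains two stacked Bernoulli RBMs with standard CD-1 and computes a crude plug-in estimate of mutual information between their hidden layers. The abstract does not specify the MI estimator or how MI enters the training objective, so this sketch only monitors an average unit-pairwise MI; the layer sizes, hyperparameters, and synthetic data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli-Bernoulli RBM trained with one step of contrastive
    divergence (CD-1). Standard formulation; sizes are illustrative."""

    def __init__(self, n_vis, n_hid, lr=0.05):
        self.W = 0.01 * rng.standard_normal((n_vis, n_hid))
        self.b = np.zeros(n_vis)        # visible bias
        self.c = np.zeros(n_hid)        # hidden bias
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.c)

    def cd1_step(self, v0):
        h0 = self.hidden_probs(v0)
        h0_s = (rng.random(h0.shape) < h0).astype(float)
        v1 = sigmoid(h0_s @ self.W.T + self.b)   # reconstruction probs
        h1 = self.hidden_probs(v1)
        n = v0.shape[0]
        self.W += self.lr * (v0.T @ h0 - v1.T @ h1) / n
        self.b += self.lr * (v0 - v1).mean(axis=0)
        self.c += self.lr * (h0 - h1).mean(axis=0)

def pairwise_mi(a, b):
    """Average empirical MI between binarized unit activations of two
    layers -- a crude plug-in estimator used here only for monitoring."""
    a, b = (a > 0.5).astype(int), (b > 0.5).astype(int)
    mi_vals = []
    for i in range(a.shape[1]):
        for j in range(b.shape[1]):
            mi = 0.0
            for x in (0, 1):
                for y in (0, 1):
                    pxy = np.mean((a[:, i] == x) & (b[:, j] == y))
                    px, py = np.mean(a[:, i] == x), np.mean(b[:, j] == y)
                    if pxy > 0:
                        mi += pxy * np.log(pxy / (px * py))
            mi_vals.append(mi)
    return float(np.mean(mi_vals))

# Greedy layer-wise pretraining of two stacked RBMs on synthetic bits,
# monitoring the MI estimate between their hidden representations.
X = (rng.random((256, 32)) < 0.3).astype(float)
rbm1, rbm2 = RBM(32, 16), RBM(16, 8)
for _ in range(20):
    rbm1.cd1_step(X)
h1 = rbm1.hidden_probs(X)
for _ in range(20):
    rbm2.cd1_step((rng.random(h1.shape) < h1).astype(float))
print("avg pairwise MI(h1, h2):", pairwise_mi(h1, rbm2.hidden_probs(h1)))
```

In the paper's setting the pretraining update itself would be driven by the MI objective rather than plain CD-1; tracking an estimate like this per epoch is one simple way to see whether stacking increases the information the second hidden layer retains about the first.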