116 results for PRECISION EXPERIMENTS
in the Cambridge University Engineering Department Publications Database
Abstract:
Structured precision modelling is an important approach to improving the intra-frame correlation modelling of the standard HMM, in which Gaussian mixture models with diagonal covariances are used. Previous work has focused on direct structured representations of the precision matrices. In this paper, a new framework is proposed in which the structure of the Cholesky square root of the precision matrix is investigated, referred to as Cholesky Basis Superposition (CBS). The Cholesky matrix associated with each Gaussian distribution is represented as a linear combination of a set of Gaussian-independent basis upper-triangular matrices. Efficient optimization methods are derived for both the combination weights and the basis matrices. Experiments on a Chinese dictation task showed that the proposed approach can significantly outperform direct structured precision modelling with a similar number of parameters, as well as full covariance modelling. © 2011 IEEE.
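As a rough illustration of the decomposition described in this abstract (the notation below is assumed for illustration and is not taken from the paper), the precision matrix of each Gaussian component can be written through its Cholesky factor as a weighted superposition of shared upper-triangular basis matrices:

    % Assumed notation: P_m is the precision matrix of Gaussian m, C_m its
    % upper-triangular Cholesky factor, U_k the shared basis matrices and
    % \lambda_{mk} the component-specific combination weights.
    P_m = C_m^{\top} C_m, \qquad C_m = \sum_{k=1}^{K} \lambda_{mk} U_k

One apparent advantage of parameterizing the Cholesky factor rather than the precision matrix itself is that C_m^T C_m is positive semi-definite for any choice of weights (and positive definite whenever C_m is non-singular), so validity of the precision matrix does not have to be enforced through additional constraints.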
Abstract:
The brain encodes visual information with limited precision. Contradictory evidence exists as to whether the precision with which an item is encoded depends on the number of stimuli in a display (set size). Some studies have found evidence that precision decreases with set size, but others have reported constant precision. These groups of studies differed in two ways. The studies that reported a decrease used displays with heterogeneous stimuli and tasks with a short-term memory component, while the ones that reported constancy used homogeneous stimuli and tasks that did not require short-term memory. To disentangle the effects of heterogeneity and short-memory involvement, we conducted two main experiments. In Experiment 1, stimuli were heterogeneous, and we compared a condition in which target identity was revealed before the stimulus display with one in which it was revealed afterward. In Experiment 2, target identity was fixed, and we compared heterogeneous and homogeneous distractor conditions. In both experiments, we compared an optimal-observer model in which precision is constant with set size with one in which it depends on set size. We found that precision decreases with set size when the distractors are heterogeneous, regardless of whether short-term memory is involved, but not when it is homogeneous. This suggests that heterogeneity, not short-term memory, is the critical factor. In addition, we found that precision exhibits variability across items and trials, which may partly be caused by attentional fluctuations.
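The two observer models compared in this abstract can be summarized by a single assumed parameterization; the power-law form below is a common choice in this literature and is used here only for illustration, not as the paper's exact model. It takes the mean precision per item to be J(N) = J1 * N^(-alpha), where alpha = 0 corresponds to precision that is constant with set size and alpha > 0 to precision that decreases with set size.

import numpy as np

def mean_precision(set_size, J1, alpha):
    """Assumed power-law parameterization of encoding precision per item.

    alpha = 0  -> precision is constant with set size
    alpha > 0  -> precision decreases with set size
    (Illustrative only; the paper's exact model may differ.)
    """
    return J1 * set_size ** (-alpha)

# Example: mean precision per item at several set sizes
for N in (1, 2, 4, 8):
    print(N, mean_precision(N, J1=30.0, alpha=0.7))

Variability of precision across items and trials, mentioned at the end of the abstract, could additionally be captured by drawing each item's precision from a distribution centred on this mean rather than treating it as fixed.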
Abstract:
Looking for a target in a visual scene becomes more difficult as the number of stimuli increases. In a signal detection theory view, this is due to the cumulative effect of noise in the encoding of the distractors and, potentially on top of that, to an increase of the noise (i.e., a decrease of precision) per stimulus with set size, reflecting divided attention. It has long been argued that human visual search behavior can be accounted for by the first factor alone. While such an account seems adequate for search tasks in which all distractors have the same, known feature value (i.e., are maximally predictable), we recently found a clear effect of set size on encoding precision when distractors are drawn from a uniform distribution (i.e., when they are maximally unpredictable). Here we interpolate between these two extreme cases to examine which of the two conclusions holds more generally as distractor statistics are varied. In one experiment, we varied the level of distractor heterogeneity; in another, we dissociated distractor homogeneity from predictability. In all conditions in both experiments, we found a strong decrease of precision with increasing set size, suggesting that precision being independent of set size is the exception rather than the rule.
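The two factors distinguished in the opening sentences can be made concrete with a toy simulation; the max-rule observer below is only an illustrative stand-in, not the model used in the paper, and all parameter values are arbitrary. Even with precision held fixed, the chance that distractor noise alone produces a target-like observation grows with set size; letting the noise per stimulus also grow with set size degrades performance further.

import numpy as np

rng = np.random.default_rng(0)

def false_alarm_rate(set_size, noise_sd, criterion=2.0, n_trials=100_000):
    """Toy max-rule observer on target-absent displays (illustrative only).

    Each distractor yields a noisy observation; the observer reports
    'target present' if the maximum observation exceeds the criterion.
    """
    obs = rng.normal(0.0, noise_sd, size=(n_trials, set_size))
    return np.mean(obs.max(axis=1) > criterion)

for N in (1, 2, 4, 8):
    fixed = false_alarm_rate(N, noise_sd=1.0)                 # constant precision
    declining = false_alarm_rate(N, noise_sd=1.0 * N ** 0.3)  # precision falls with N
    print(N, round(fixed, 3), round(declining, 3))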
Abstract:
Experimental and computational studies on the dynamics of millimeter-scale cylindrical liquid jets are presented. The influence of the modulation amplitude and the nozzle geometry on jet behavior has been considered. Laser Doppler anemometry (LDA) was used to extract the velocity field of a jet along its length and to determine the velocity modulation amplitude. Jet shapes and breakup dynamics were observed via shadowgraph imaging. Aqueous solutions of glycerol were used for these experiments. Results were compared with Lagrangian finite-element simulations, with good quantitative agreement. © 2011 The American Physical Society.
Abstract:
The creation and evolution of millimeter-sized droplets of a Newtonian liquid, generated on demand by the action of pressure pulses, were studied experimentally and simulated numerically. The velocity response within a large-scale model printhead was recorded by laser Doppler anemometry, and the recorded waveform was used as an input to Lagrangian finite-element simulations. Droplet shapes and positions were observed by shadowgraphy and compared with their numerically obtained analogues. © 2011 American Physical Society.
Abstract:
A pin-on-cylinder wear rig has been built with precision stepping-motor drives for both the rotary and axial motions, enabling accurate positional control. Initial experiments using sapphire indenters running against copper substrates have investigated the build-up of a single wear groove by repeated sliding along the same track. An approximate three-dimensional ploughing analysis is also presented, and the results of theory and experiment are compared.
Abstract:
Using fluorescence microscopy with single-molecule sensitivity, it is now possible to follow the movement of individual fluorophore-tagged molecules, such as proteins and lipids, in the cell membrane with nanometer precision. These experiments are important because they allow many key biological processes on the cell membrane and in the cell, such as transcription, translation and DNA replication, to be studied at new levels of detail. Computerized microscopes generate sequences of images (on the order of tens to hundreds) of the diffusing molecules, and one of the challenges is to track these molecules to obtain reliable statistics such as speed distributions, diffusion patterns and intracellular positioning. The data set is challenging because the molecules are tagged with a single fluorophore or a small number of fluorophores, which makes them difficult to distinguish from the background; the fluorophores bleach irreversibly over time; the number of tagged molecules is unknown; and there is occasional loss of signal from the tagged molecules. All these factors make accurate tracking over long trajectories difficult. The experiments are also technically difficult to conduct, so there is a pressing need for better algorithms that extract the maximum information from the data. For this purpose, we propose a Bayesian approach and apply our technique to synthetic and real experimental data sets.
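As a minimal sketch of the kind of inference involved, the snippet below implements a generic Kalman-style Bayesian update for one coordinate of a freely diffusing molecule observed with localization noise and occasional missed detections. It is not the algorithm proposed in the paper, which must additionally handle irreversible bleaching, an unknown number of molecules and the association of detections to tracks; all names and parameter values here are assumptions.

import numpy as np

def track_1d(observations, diffusion_var, obs_var, x0, p0):
    """Minimal Bayesian (Kalman) tracker for one coordinate of a diffusing
    molecule observed with Gaussian localization noise.

    Illustrative sketch only: the real single-molecule tracking problem
    (bleaching, missed detections, unknown molecule count, data association)
    requires a richer model, e.g. a particle filter.
    """
    x, p = x0, p0                  # posterior mean and variance of position
    means = []
    for z in observations:
        p = p + diffusion_var      # predict: Brownian motion inflates variance
        if z is not None:          # update only when the molecule was detected
            k = p / (p + obs_var)  # Kalman gain
            x = x + k * (z - x)
            p = (1.0 - k) * p
        means.append(x)
    return np.array(means)

# Example: noisy positions (arbitrary units) with one missed detection
obs = [0.00, 0.12, None, 0.35, 0.48]
print(track_1d(obs, diffusion_var=0.01, obs_var=0.04, x0=0.0, p0=1.0))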