950 results for Greedy String Tiling


Relevance: 10.00%

Abstract:

Horace's last Satire describes a disastrous dinner party hosted by the gourmet Nasidienus, which is ruined by a collapsing tapestry. The food served afterwards is presented in a dismembered state. This chapter argues that several elements of the scene recall the greedy Harpies of Apollonius' Argonautica, and that Horace's friend Virgil shows the influence of this Satire in his own Harpy-scene in Aeneid 3. It also argues that the confusion in the middle of the dinner causes the food cooking in the kitchen to be neglected and burned. This explains the state of the subsequent courses, which Nasidienus has salvaged from a separate disaster backstage.

Relevance: 10.00%

Abstract:

The artificial grammar (AG) learning literature (see, e.g., Mathews et al., 1989; Reber, 1967) has relied heavily on a single measure of implicitly acquired knowledge. Recent work comparing this measure (string classification) with a more indirect measure in which participants make liking ratings of novel stimuli (e.g., Manza & Bornstein, 1995; Newell & Bright, 2001) has shown that string classification (which we argue can be thought of as an explicit, rather than an implicit, measure of memory) gives rise to more explicit knowledge of the grammatical structure in learning strings and is more resilient to changes in surface features and processing between encoding and retrieval. We report data from two experiments that extend these findings. In Experiment 1, we showed that a divided attention manipulation (at retrieval) interfered with explicit retrieval of AG knowledge but did not interfere with implicit retrieval. In Experiment 2, we showed that forcing participants to respond within a very tight deadline resulted in the same asymmetric interference pattern between the tasks. In both experiments, we also showed that the type of information being retrieved influenced whether interference was observed. The results are discussed in terms of the relatively automatic nature of implicit retrieval and also with respect to the differences between analytic and nonanalytic processing (Whittlesea & Price, 2001).

Relevance: 10.00%

Abstract:

A greedy technique is proposed to construct parsimonious kernel classifiers using the orthogonal forward selection method and boosting based on the Fisher ratio for class separability measure. Unlike most kernel classification methods, which restrict kernel means to the training input data and use a fixed common variance for all the kernel terms, the proposed technique can tune both the mean vector and diagonal covariance matrix of each individual kernel by incrementally maximizing the Fisher ratio for class separability measure. An efficient weighted optimization method is developed based on boosting to append kernels one by one in an orthogonal forward selection procedure. Experimental results obtained using this construction technique demonstrate that it offers a viable alternative to the existing state-of-the-art kernel modeling methods for constructing sparse Gaussian radial basis function network classifiers that generalize well.
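The core idea of greedy forward selection driven by the Fisher ratio can be sketched as follows. This is an illustrative simplification, not the paper's method: kernel centres are restricted to the training inputs with a fixed width, and the orthogonalisation, boosting-based weighting, and per-kernel covariance tuning are omitted; all function names are invented.

```python
import numpy as np

def fisher_ratio(feature, y):
    """Fisher ratio (m1 - m2)^2 / (v1 + v2) of one feature column
    for a two-class problem with labels in {0, 1}."""
    f1, f2 = feature[y == 1], feature[y == 0]
    num = (f1.mean() - f2.mean()) ** 2
    den = f1.var() + f2.var() + 1e-12  # guard against zero variance
    return num / den

def greedy_rbf_selection(X, y, n_kernels=3, width=1.0):
    """Greedily pick RBF kernel centres from the training inputs,
    each chosen to maximise the Fisher ratio of its response."""
    centres, candidates = [], list(range(len(X)))
    for _ in range(n_kernels):
        best, best_f = None, -np.inf
        for i in candidates:
            # RBF response of candidate centre X[i] on all samples
            phi = np.exp(-np.sum((X - X[i]) ** 2, axis=1) / (2 * width ** 2))
            f = fisher_ratio(phi, y)
            if f > best_f:
                best, best_f = i, f
        centres.append(best)
        candidates.remove(best)
    return centres
```

On two well-separated clusters, the first selected centre's kernel response already separates the classes almost perfectly; later centres add progressively less.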

Relevance: 10.00%

Abstract:

Rodney Brooks has been called the "Self-Styled Bad Boy of Robotics". In the 1990s he gained this dubious honour by orchestrating a string of highly evocative robots from his artificial intelligence labs at the Massachusetts Institute of Technology (MIT), Cambridge, Massachusetts, USA.

Relevance: 10.00%

Abstract:

Embodied theories of cognition propose that neural substrates used in experiencing the referent of a word, for example perceiving upward motion, should be engaged in weaker form when that word, for example 'rise', is comprehended. Motivated by the finding that the perception of irrelevant background motion at near-threshold, but not supra-threshold, levels interferes with task execution, we assessed whether interference from near-threshold background motion was modulated by its congruence with the meaning of words (semantic content) when participants completed a lexical decision task (deciding if a string of letters is a real word or not). Reaction times for motion words, such as 'rise' or 'fall', were slower when the direction of visual motion and the 'motion' of the word were incongruent, but only when the visual motion was at near-threshold levels. When motion was supra-threshold, the distribution of error rates, not reaction times, implicated low-level motion processing in the semantic processing of motion words. As the perception of near-threshold signals is not likely to be influenced by strategies, our results support a close contact between semantic information and perceptual systems.

Relevance: 10.00%

Abstract:

It has been shown through a number of experiments that neural networks can be used for a phonetic typewriter. The algorithms can be viewed as producing self-organizing feature maps whose units correspond to phonemes. In the Chinese language, the utterance of a Chinese character consists of a very simple string of Chinese phonemes. With this as a starting point, a neural network feature map for Chinese phonemes can be built up. In this paper, feature map structures for Chinese phonemes are discussed and tested. This research on a Chinese phonetic feature map is important both for Chinese speech recognition and for building a Chinese phonetic typewriter.
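The self-organizing feature map underlying such a phoneme map can be sketched minimally as below. This is an illustrative Kohonen-style update only, trained here on synthetic vectors rather than acoustic features; the grid size, decay schedules, and function name are all assumptions, not details from the paper.

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=20, lr0=0.5, sigma0=1.5, seed=0):
    """Minimal Kohonen self-organising feature map: each training vector
    pulls its best-matching unit (and grid neighbours) towards itself."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h, w, data.shape[1]))
    # (h, w, 2) array of each unit's grid coordinates
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                  indexing="ij"), axis=-1)
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)              # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.3  # shrinking neighbourhood
        for x in data:
            # best-matching unit in weight space
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighbourhood function on the grid
            g = np.exp(-np.linalg.norm(coords - np.array(bmu), axis=-1) ** 2
                       / (2 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
    return weights
```

For a real phoneme map, `data` would hold short-time spectral feature vectors; clustered inputs end up mapped to distinct, topologically neighbouring units.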

Relevance: 10.00%

Abstract:

The Stochastic Diffusion Search (SDS) was developed as a solution to the best-fit search problem. Thus, as a special case, it is capable of solving the transform invariant pattern recognition problem. SDS is efficient and, although inherently probabilistic, produces very reliable solutions in widely ranging search conditions. However, to date a systematic formal investigation of its properties has not been carried out. This thesis addresses this problem. The thesis reports results pertaining to the global convergence of SDS as well as characterising its time complexity. However, the main emphasis of the work is the resource allocation aspect of the Stochastic Diffusion Search's operation. The thesis introduces a novel model of the algorithm, generalising an Ehrenfest Urn Model from statistical physics. This approach makes it possible to obtain a thorough characterisation of the response of the algorithm in terms of the parameters describing the search conditions in the case of a unique best-fit pattern in the search space. This model is further generalised in order to account for different search conditions: two solutions in the search space, and search for a unique solution in a noisy search space. Also, an approximate solution in the case of two alternative solutions is proposed and compared with predictions of the extended Ehrenfest Urn model. The analysis performed enabled a quantitative characterisation of the Stochastic Diffusion Search in terms of exploration and exploitation of the search space. It appeared that SDS is biased towards the latter mode of operation. This novel perspective on the Stochastic Diffusion Search led to an investigation of extensions of the standard SDS, which would strike a different balance between these two modes of search space processing.
Thus, two novel algorithms were derived from the standard Stochastic Diffusion Search, ‘context-free’ and ‘context-sensitive’ SDS, and their properties were analysed with respect to resource allocation. It appeared that they shared some of the desired features of their predecessor but also possessed some properties not present in the classic SDS. The theory developed in the thesis was illustrated throughout with carefully chosen simulations of a best-fit search for a string pattern, a simple but representative domain, enabling careful control of search conditions.
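The string best-fit search used as the running example can be sketched as a minimal standard SDS. This is illustrative only: the agent count, iteration budget, and function name are assumptions, and the noisy and two-solution variants analysed in the thesis are not modelled.

```python
import random

def sds_string_search(text, pattern, n_agents=50, iterations=200, seed=0):
    """Minimal standard Stochastic Diffusion Search for the best-fit
    location of `pattern` in `text`.  Agents hold candidate positions;
    in the test phase each agent checks a single randomly chosen
    character, and in the diffusion phase inactive agents copy
    hypotheses from randomly polled active agents."""
    rng = random.Random(seed)
    positions = list(range(len(text) - len(pattern) + 1))
    hyps = [rng.choice(positions) for _ in range(n_agents)]
    active = [False] * n_agents
    for _ in range(iterations):
        # test phase: cheap partial evaluation of each hypothesis
        for i in range(n_agents):
            j = rng.randrange(len(pattern))
            active[i] = text[hyps[i] + j] == pattern[j]
        # diffusion phase: inactive agents recruit or restart
        for i in range(n_agents):
            if not active[i]:
                k = rng.randrange(n_agents)
                hyps[i] = hyps[k] if active[k] else rng.choice(positions)
    # the largest cluster of agents marks the best-fit position
    return max(set(hyps), key=hyps.count)
```

With an exact match present, agents at the true position pass every single-character test and steadily recruit the rest of the population, illustrating the exploitation bias the thesis quantifies.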

Relevance: 10.00%

Abstract:

Proteomics approaches have made important contributions to the characterisation of platelet regulatory mechanisms. A common problem encountered with this method, however, is the masking of low-abundance (e.g. signalling) proteins in complex mixtures by highly abundant proteins. In this study, subcellular fractionation of washed human platelets either inactivated or stimulated with the glycoprotein (GP) VI collagen receptor agonist, collagen-related peptide, reduced the complexity of the platelet proteome. The majority of proteins identified by tandem mass spectrometry are involved in signalling. The effect of GPVI stimulation on levels of specific proteins in subcellular compartments was compared and analysed using in silico quantification, and protein associations were predicted using STRING (the search tool for recurring instances of neighbouring genes/proteins). Interestingly, we observed that some proteins that were previously unidentified in platelets including teneurin-1 and Van Gogh-like protein 1, translocated to the membrane upon GPVI stimulation. Newly identified proteins may be involved in GPVI signalling nodes of importance for haemostasis and thrombosis.

Relevance: 10.00%

Abstract:

Evolutionary meta-algorithms for pulse shaping of broadband femtosecond duration laser pulses are proposed. The genetic algorithm searching the evolutionary landscape for desired pulse shapes consists of a population of waveforms (genes), each made from two concatenated vectors, specifying phases and magnitudes, respectively, over a range of frequencies. Frequency domain operators such as mutation, two-point crossover, average crossover, polynomial phase mutation, creep and three-point smoothing, as well as a time-domain crossover, are combined to produce fitter offspring at each iteration step. The algorithm applies roulette wheel selection, elitism, and linear fitness scaling to the gene population. A differential evolution (DE) operator that provides a source of directed mutation and new wavelet operators are proposed. Using properly tuned parameters for DE, the meta-algorithm is used to solve a waveform matching problem. Tuning allows either a greedy directed search near the best known solution or a robust search across the entire parameter space.
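Two of the named operators, roulette-wheel selection and two-point crossover, can be sketched on plain gene lists as below. This is a generic illustration under assumed names and signatures, not the paper's implementation; the fitness function, the remaining operators, and the DE step are omitted.

```python
import random

def roulette_select(population, fitnesses, rng):
    """Roulette-wheel (fitness-proportionate) selection of one individual."""
    total = sum(fitnesses)
    r = rng.uniform(0, total)
    acc = 0.0
    for ind, f in zip(population, fitnesses):
        acc += f
        if acc >= r:
            return ind
    return population[-1]  # guard against floating-point round-off

def two_point_crossover(a, b, rng):
    """Two-point crossover on equal-length gene vectors (here, the
    concatenated phase/magnitude vectors the abstract describes)."""
    i, j = sorted(rng.sample(range(1, len(a)), 2))
    return a[:i] + b[i:j] + a[j:], b[:i] + a[i:j] + b[j:]
```

Crossover swaps the middle segment between the two cut points, so the combined gene multiset of the pair is preserved while new phase/magnitude combinations are explored.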

Relevance: 10.00%

Abstract:

The European summer of 2012 was marked by strongly contrasting rainfall anomalies, which led to flooding in northern Europe and droughts and wildfires in southern Europe. This season was not an isolated event, but rather the latest in a string of summers characterized by a southward-shifted Atlantic storm track, as described by the negative phase of the summer North Atlantic Oscillation (SNAO). The degree of decadal variability in these features suggests a role for forcing from outside the dynamical atmosphere, and preliminary numerical experiments suggest that the global SST and low Arctic sea ice extent anomalies are likely to have played a role, and that warm North Atlantic SSTs were a particular contributing factor. The direct effects of changes in radiative forcing from greenhouse gas and aerosol forcing are not included in these experiments, but both anthropogenic forcing and natural variability may have influenced the SST and sea ice changes.

Relevance: 10.00%

Abstract:

A method is presented to calculate the continuum-scale sea ice stress as an imposed, continuum-scale strain rate is varied. The continuum-scale stress is calculated as the area-average of the stresses within the floes and leads in a region (the continuum element). The continuum-scale stress depends upon: the imposed strain rate; the subcontinuum-scale material rheology of sea ice; the chosen configuration of sea ice floes and leads; and a prescribed rule for determining the motion of the floes in response to the continuum-scale strain rate. We calculated plastic yield curves and flow rules associated with subcontinuum-scale material sea ice rheologies with elliptic, linear and modified Coulombic elliptic plastic yield curves, and with square, diamond and irregular, convex polygon-shaped floes. For the case of a tiling of square floes, only for particular orientations of the leads have the principal axes of strain rate and calculated continuum-scale sea ice stress aligned, and these have been investigated analytically. The ensemble average of calculated sea ice stress for square floes with uniform orientation with respect to the principal axes of strain rate yielded alignment of average stress and strain-rate principal axes and an isotropic, continuum-scale sea ice rheology. We present a lemon-shaped yield curve with normal flow rule, derived from ensemble averages of sea ice stress, suitable for direct inclusion into the current generation of sea ice models. This continuum-scale sea ice rheology directly relates the size (strength) of the continuum-scale yield curve to the material compressive strength.

Relevance: 10.00%

Abstract:

The dependence of the annual mean tropical precipitation on horizontal resolution is investigated in the atmospheric version of the Hadley Centre General Environment Model (HadGEM1). Reducing the grid spacing from about 350 km to 110 km improves the precipitation distribution in most of the tropics. In particular, characteristic dry biases over South and Southeast Asia including the Maritime Continent as well as wet biases over the western tropical oceans are reduced. The annual-mean precipitation bias is reduced by about one third over the Maritime Continent and the neighbouring ocean basins associated with it via the Walker circulation. Sensitivity experiments show that much of the improvement with resolution in the Maritime Continent region is due to the specification of better resolved surface boundary conditions (land fraction, soil and vegetation parameters) at the higher resolution. It is shown that in particular the formulation of the coastal tiling scheme may cause resolution sensitivity of the mean simulated climate. The improvement in the tropical mean precipitation in this region is not primarily associated with the better representation of orography at the higher resolution, nor with changes in the eddy transport of moisture. Sizeable sensitivity to changes in the surface fields may be one of the reasons for the large variation of the mean tropical precipitation distribution seen across climate models.

Relevance: 10.00%

Abstract:

Learning a low dimensional manifold from highly nonlinear data of high dimensionality has become increasingly important for discovering an intrinsic representation that can be utilized for data visualization and preprocessing. The autoencoder is a powerful dimensionality reduction technique based on minimizing reconstruction error, and it has regained popularity because it has been efficiently used for greedy pretraining of deep neural networks. The superiority of the Gaussian Process (GP) over the Neural Network (NN) has been shown in model inference, optimization and performance. GPs have been successfully applied in nonlinear Dimensionality Reduction (DR) algorithms, such as the Gaussian Process Latent Variable Model (GPLVM). In this paper we propose the Gaussian Process Autoencoder Model (GPAM) for dimensionality reduction by extending the classic NN based autoencoder to a GP based autoencoder. More interestingly, the novel model can also be viewed as a back constrained GPLVM (BC-GPLVM) where the back constraint smooth function is represented by a GP. Experiments verify the performance of the newly proposed model.
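The reconstruction-error objective that autoencoders minimize can be sketched in its simplest, linear form as below. This is only the underlying objective, not GPAM: the paper replaces these parametric encoder/decoder maps with Gaussian-process mappings, and the hyperparameters and function name here are invented.

```python
import numpy as np

def train_autoencoder(X, n_hidden=2, epochs=3000, lr=0.01, seed=0):
    """Minimal linear autoencoder trained by gradient descent on the
    mean squared reconstruction error ||X W_enc W_dec - X||^2 / n."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W_enc = rng.normal(scale=0.1, size=(d, n_hidden))
    W_dec = rng.normal(scale=0.1, size=(n_hidden, d))
    for _ in range(epochs):
        Z = X @ W_enc           # low-dimensional code
        R = Z @ W_dec - X       # reconstruction residual
        # gradient steps for decoder and encoder weights
        W_dec -= lr * Z.T @ R / n
        W_enc -= lr * X.T @ (R @ W_dec.T) / n
    return W_enc, W_dec
```

When the data truly lie near a low-dimensional linear subspace, the learned code `X @ W_enc` recovers that subspace and the reconstruction error falls well below the variance of the raw data.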

Relevance: 10.00%

Abstract:

This exploratory study is concerned with the performance of Egyptian children with Down syndrome on counting and error detection tasks and investigates how these children acquire counting. Observations and interviews were carried out to collect further information about their performance in a class context. Qualitative and quantitative analysis suggested a notable deficit in counting in Egyptian children with Down syndrome, with none of the children able to recite the number string up to ten or count a set of five objects correctly. They performed less well on tasks which placed more load on memory. The tentative findings of this exploratory study supported previous research findings that children with Down syndrome acquire counting by rote, and linked this with their learning experiences.

Relevance: 10.00%

Abstract:

Duplication at the Xq28 band including the MECP2 gene is one of the most common genomic rearrangements identified in neurodevelopmentally delayed males. Such duplications are non-recurrent and can be generated by a non-homologous end joining (NHEJ) mechanism. We investigated the potential mechanisms for MECP2 duplication and examined whether genomic architectural features may play a role in their origin using a custom designed 4-Mb tiling-path oligonucleotide array CGH assay. Each of the 30 patients analyzed showed a unique duplication varying in size from ~250 kb to ~2.6 Mb. Interestingly, in 77% of these non-recurrent duplications, the distal breakpoints grouped within a 215 kb genomic interval, located 47 kb telomeric to the MECP2 gene. The genomic architecture of this region contains both direct and inverted low-copy repeat (LCR) sequences; this same region undergoes polymorphic structural variation in the general population. Array CGH revealed complex rearrangements in eight patients; in six patients the duplication contained an embedded triplicated segment, and in the other two, stretches of non-duplicated sequences occurred within the duplicated region. Breakpoint junction sequencing was achieved in four duplications and identified an inversion in one patient, demonstrating further complexity. We propose that the presence of LCRs in the vicinity of the MECP2 gene may generate an unstable DNA structure that can induce DNA strand lesions, such as a collapsed fork, and facilitate a Fork Stalling and Template Switching event producing the complex rearrangements involving MECP2.