3 results for Almost Hypercomplex Manifold

in Aston University Research Archive


Relevance:

20.00%

Publisher:

Abstract:

Curriculum innovation is challenging and, as several commentators have reported, moves to introduce communicative language teaching in many contexts internationally have resulted in mixed outcomes, or even failure. In an effort to shed some light on this complex problem, this article focuses on curriculum change through the introduction of new communicative textbooks in an engineering college (kosen) in Japan. First, three key factors that inhibit change are considered and then other factors that specifically hindered change in the kosen environment are identified. A study investigating the attitudes and classroom practices of four Japanese teachers of English highlighted a culture of pedagogical uncertainty and lack of professional support. Suggestions for supporting teachers to implement curriculum change more effectively, both in Japan and elsewhere, are drawn out.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we investigate the use of manifold learning techniques to enhance the separation properties of standard graph kernels. The idea stems from the observation that, when we perform multidimensional scaling on the distance matrices extracted from the kernels, the resulting data tend to cluster along a curve that wraps around the embedding space, a behavior suggesting that long-range distances are not estimated accurately and that the embedding space is therefore strongly curved. Hence, we propose to use a number of manifold learning techniques to compute a low-dimensional embedding of the graphs, in an attempt to unfold the embedding manifold and increase the class separation. We perform an extensive experimental evaluation on a number of standard graph datasets using the shortest-path (Borgwardt and Kriegel, 2005), graphlet (Shervashidze et al., 2009), random walk (Kashima et al., 2003) and Weisfeiler-Lehman (Shervashidze et al., 2011) kernels. We observe the most significant improvement in the case of the graphlet kernel, which is consistent with the observation that neglecting the locational information of the substructures leads to a stronger curvature of the embedding manifold. The Weisfeiler-Lehman kernel, on the other hand, partially mitigates the locality problem by using node label information, and thus does not clearly benefit from manifold learning. Interestingly, our experiments also show that unfolding the space appears to reduce the performance gap between the examined kernels.
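The pipeline described in this abstract can be sketched with standard Python tooling. The snippet below is a minimal illustration, not the authors' implementation: it assumes a precomputed graph-kernel Gram matrix K (replaced here by a random stand-in), converts it to kernel-induced distances, and contrasts a multidimensional scaling embedding of the full distance matrix with an Isomap embedding built only from local neighbourhood distances. The dataset, neighbourhood size and embedding dimension are arbitrary illustrative choices.

import numpy as np
from sklearn.manifold import MDS, Isomap

def kernel_to_distance(K):
    # Kernel-induced distances: d_ij^2 = k_ii + k_jj - 2 * k_ij.
    diag = np.diag(K)
    d2 = diag[:, None] + diag[None, :] - 2.0 * K
    return np.sqrt(np.maximum(d2, 0.0))

# Hypothetical precomputed graph-kernel Gram matrix over 60 graphs
# (e.g. graphlet or shortest-path); a random stand-in is used here.
rng = np.random.default_rng(0)
features = rng.normal(size=(60, 5))
K = features @ features.T

D = kernel_to_distance(K)

# Multidimensional scaling on the full distance matrix; in the setting
# described above, the embedded points tend to cluster along a curve.
mds_embedding = MDS(n_components=2, dissimilarity="precomputed",
                    random_state=0).fit_transform(D)

# Isomap estimates geodesic distances from local neighbourhoods only,
# which can unfold the curved manifold and improve class separation.
iso_embedding = Isomap(n_neighbors=10, n_components=2,
                       metric="precomputed").fit_transform(D)

Either embedding can then be passed to a standard classifier; the evaluation described above compares such embeddings across several graph kernels.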

Relevance:

20.00%

Publisher:

Abstract:

The quantum Jensen-Shannon divergence kernel [1] was recently introduced in the context of unattributed graphs, where it was shown to outperform several commonly used alternatives. In this paper, we study the separability properties of this kernel and propose a way to compute a low-dimensional kernel embedding in which the separation of the different classes is enhanced. The idea stems from the observation that the multidimensional scaling embeddings of this kernel show a strong horseshoe-shaped distribution, a pattern known to arise when long-range distances are not estimated accurately. Here we propose to use Isomap to embed the graphs, using only local distance information, onto a new vector space with higher class separability. The experimental evaluation shows the effectiveness of the proposed approach. © 2013 Springer-Verlag.
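As a rough sketch of the proposed embedding step, the snippet below assumes a precomputed QJSD kernel matrix K and class labels y (random stand-ins here), derives the kernel-induced distances, embeds the graphs with Isomap using only local distance information, and compares class separability against a plain MDS embedding via a simple nearest-neighbour classifier. All names and parameter values are illustrative assumptions, not the paper's experimental setup.

import numpy as np
from sklearn.manifold import MDS, Isomap
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical inputs: a precomputed QJSD kernel matrix K over 80 graphs
# and their class labels y; random stand-ins are used for illustration.
rng = np.random.default_rng(1)
features = rng.normal(size=(80, 6))
K = features @ features.T
y = rng.integers(0, 2, size=80)

# Kernel-induced pairwise distances.
diag = np.diag(K)
D = np.sqrt(np.maximum(diag[:, None] + diag[None, :] - 2.0 * K, 0.0))

# Global MDS embedding (prone to the horseshoe pattern) versus an
# Isomap embedding built only from local neighbourhood distances.
mds_emb = MDS(n_components=3, dissimilarity="precomputed",
              random_state=0).fit_transform(D)
iso_emb = Isomap(n_neighbors=8, n_components=3,
                 metric="precomputed").fit_transform(D)

# Compare class separability with a simple nearest-neighbour classifier.
knn = KNeighborsClassifier(n_neighbors=3)
print("MDS embedding   :", cross_val_score(knn, mds_emb, y, cv=5).mean())
print("Isomap embedding:", cross_val_score(knn, iso_emb, y, cv=5).mean())

With random stand-in data the two scores are not expected to differ meaningfully; the comparison only illustrates how the embeddings would be evaluated on real graph-kernel matrices.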