35 results for margin


Relevance:

10.00%

Publisher:

Abstract:

LED-based carrierless amplitude and phase modulation is investigated for a multi-gigabit plastic optical fibre link. An FPGA-based 1.5 Gbit/s error-free transmission over 50 m of standard SI-POF using CAP64 is achieved, providing a 2.9 dB power margin without forward error correction. © 2012 OSA.
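The CAP scheme summarised in this abstract replaces an explicit carrier with a pair of orthogonal pass-band shaping filters (a Hilbert pair). A minimal transmitter sketch in Python/NumPy, assuming a root-raised-cosine baseband pulse, 8 samples per symbol, and a normalised centre frequency of 0.25; all parameter values and function names here are illustrative, not taken from the paper:

```python
import numpy as np

def rrc_pulse(beta, sps, span):
    """Root-raised-cosine impulse response sampled at sps samples/symbol."""
    n = np.arange(-span * sps, span * sps + 1)
    t = n / sps
    h = np.empty_like(t, dtype=float)
    for i, ti in enumerate(t):
        if np.isclose(ti, 0.0):  # singular point at t = 0
            h[i] = 1 - beta + 4 * beta / np.pi
        elif np.isclose(abs(ti), 1 / (4 * beta)):  # singular points at |t| = 1/(4*beta)
            h[i] = (beta / np.sqrt(2)) * (
                (1 + 2 / np.pi) * np.sin(np.pi / (4 * beta))
                + (1 - 2 / np.pi) * np.cos(np.pi / (4 * beta)))
        else:
            h[i] = (np.sin(np.pi * ti * (1 - beta))
                    + 4 * beta * ti * np.cos(np.pi * ti * (1 + beta))) \
                   / (np.pi * ti * (1 - (4 * beta * ti) ** 2))
    return n, h

def cap_filters(beta=0.2, sps=8, span=6, fc=0.25):
    """Orthogonal (Hilbert-pair) CAP shaping filters: the same baseband
    pulse multiplied by cosine and sine at the normalised centre frequency."""
    n, h = rrc_pulse(beta, sps, span)
    return h * np.cos(2 * np.pi * fc * n), h * np.sin(2 * np.pi * fc * n)

def cap64_modulate(bits, sps=8):
    """Map 6 bits/symbol onto an 8x8 grid (levels -7..7) and shape the
    in-phase / quadrature streams; no explicit carrier is generated."""
    assert len(bits) % 6 == 0
    sym = np.asarray(bits).reshape(-1, 6)
    ai = 2 * (sym[:, :3] @ np.array([4, 2, 1])) - 7  # 3 bits -> {-7,...,7}
    aq = 2 * (sym[:, 3:] @ np.array([4, 2, 1])) - 7
    f_i, f_q = cap_filters(sps=sps)
    up_i = np.zeros(len(ai) * sps)  # upsample by inserting zeros
    up_q = np.zeros(len(aq) * sps)
    up_i[::sps] = ai
    up_q[::sps] = aq
    return np.convolve(up_i, f_i) - np.convolve(up_q, f_q)
```

The orthogonality of the two filters is what lets the receiver separate the in-phase and quadrature symbol streams with matched filtering alone.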

Relevance:

10.00%

Publisher:

Abstract:

Simulations have investigated single-laser 100G Ethernet links enabled by CAP-16 using QAM receivers, which not only significantly lower the system's sensitivity to timing jitter but also outperform PAM and standard CAP in terms of power margin. © 2013 OSA.

Relevance:

10.00%

Publisher:

Abstract:

Large margin criteria and discriminative models are two effective improvements for HMM-based speech recognition. This paper proposes a large-margin-trained log-linear model with kernels for continuous speech recognition (CSR). To avoid explicitly computing in the high-dimensional feature space and to achieve nonlinear decision boundaries, a kernel-based training and decoding framework is proposed. To make the system robust to noise, a kernel adaptation scheme is also presented. Previous work in this area is extended in two directions. First, most kernels for CSR focus on measuring the similarity between two observation sequences; the proposed joint kernels define a similarity between two observation-label sequence pairs at the sentence level. Second, this paper addresses how to efficiently employ kernels in large margin training and decoding with lattices. To the best of our knowledge, this is the first attempt at using large-margin kernel-based log-linear models for CSR. The model is evaluated on a noise-corrupted continuous digit task: AURORA 2.0. © 2013 IEEE.
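The full system in this abstract trains on observation-label sequence pairs with lattices; as a drastically simplified sketch of the core idea, here is hinge-loss (margin ≥ 1) training of the dual weights of a kernelised multiclass model on plain vectors rather than sequences. All names, the RBF kernel choice, and the parameter values are hypothetical stand-ins, not the paper's method:

```python
import numpy as np

def rbf(X, Z, gamma=1.0):
    """RBF kernel matrix k(x_i, z_j) = exp(-gamma * ||x_i - z_j||^2)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def train_large_margin(X, y, n_classes, epochs=100, lr=0.1, lam=1e-4):
    """Subgradient updates on dual weights A[c, i]; the score of class c
    for x_j is sum_i A[c, i] * k(x_i, x_j) (a kernelised linear model)."""
    K = rbf(X, X)
    A = np.zeros((n_classes, len(X)))
    for _ in range(epochs):
        A *= (1 - lr * lam)                 # L2 shrinkage on dual weights
        for j in range(len(X)):
            s = A @ K[:, j]                 # score of every class for x_j
            s_true = s[y[j]]
            s[y[j]] = -np.inf
            rival = int(np.argmax(s))       # highest-scoring wrong class
            if s_true - s[rival] < 1.0:     # margin violated -> hinge update
                A[y[j], j] += lr
                A[rival, j] -= lr
    return A

def predict(A, X_train, X_new):
    return np.argmax(A @ rbf(X_train, X_new), axis=0)
```

The update only fires when the margin between the true class and its closest rival falls below 1, which is the same large-margin principle the paper applies at the sentence level with joint sequence kernels.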

Relevance:

10.00%

Publisher:

Abstract:

© 2015 John P. Cunningham and Zoubin Ghahramani. Linear dimensionality reduction methods are a cornerstone of analyzing high-dimensional data, due to their simple geometric interpretations and typically attractive computational properties. These methods capture many data features of interest, such as covariance, dynamical structure, correlation between data sets, input-output relationships, and margin between data classes. Methods have been developed with a variety of names and motivations in many fields, and perhaps as a result the connections between all these methods have not been highlighted. Here we survey methods from this disparate literature as optimization programs over matrix manifolds. We discuss principal component analysis, factor analysis, linear multidimensional scaling, Fisher's linear discriminant analysis, canonical correlations analysis, maximum autocorrelation factors, slow feature analysis, sufficient dimensionality reduction, undercomplete independent component analysis, linear regression, distance metric learning, and more. This optimization framework gives insight into some rarely discussed shortcomings of well-known methods, such as the suboptimality of certain eigenvector solutions. Modern techniques for optimization over matrix manifolds enable a generic linear dimensionality reduction solver, which accepts as input data and an objective to be optimized, and returns, as output, an optimal low-dimensional projection of the data. This simple optimization framework further allows straightforward generalizations and novel variants of classical methods, which we demonstrate here by creating an orthogonal-projection canonical correlations analysis. More broadly, this survey and generic solver suggest that linear dimensionality reduction can move toward becoming a black-box, objective-agnostic numerical technology.
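The survey's central device is casting each linear dimensionality reduction method as an optimization program over a matrix manifold. As a toy illustration of that viewpoint (not the authors' solver), here is PCA written as trace maximization of the projected covariance over orthonormal matrices (the Stiefel manifold), solved by a generic gradient step plus QR retraction instead of an eigendecomposition; the function name, step size, and iteration count are made up for the sketch:

```python
import numpy as np

def pca_manifold(X, r, iters=500, lr=0.1):
    """Solve  max_M  tr(M^T C M)  s.t.  M^T M = I  (M is d x r)
    by gradient ascent, retracting back onto the Stiefel manifold
    with a QR factorization after every step."""
    Xc = X - X.mean(axis=0)
    C = Xc.T @ Xc / len(X)                 # sample covariance, d x d
    rng = np.random.default_rng(0)
    M, _ = np.linalg.qr(rng.normal(size=(C.shape[0], r)))  # random orthonormal start
    for _ in range(iters):
        G = 2 * C @ M                      # Euclidean gradient of tr(M^T C M)
        M, _ = np.linalg.qr(M + lr * G)    # step, then retract: M^T M = I again
    return M
```

At the optimum the objective tr(M^T C M) equals the sum of the top-r eigenvalues of C, so the generic manifold solver recovers the same subspace as the classical eigenvector solution; the point of the framework is that swapping in a different objective yields a different method with the same solver.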