65 results for Julie Cunningham
Abstract:
We propose a new, practical launch scheme for multimode optical fiber that provides near-single-mode-group excitation, yielding a more than fivefold improvement in transmission bandwidth. Equalization-free transmission of a 10-Gb/s signal over 220 m of fiber is achieved in experimental demonstrations. © 2010 Optical Society of America.
Abstract:
100-Gb/s PAM4-CAP2 modulation is demonstrated for next-generation data-communication links. Simulation studies indicate a power-budget advantage of 2.5 dBo relative to PAM8 modulation. A real-time experimental demonstration is performed. © OSA 2014.
Abstract:
A novel twin-spot launch is proposed for multimode-fiber (MMF) links. Experimental and theoretical investigation of the launch indicates a penalty reduction of ≈50% of the 10 Gigabit Ethernet allocation for EDC-enabled links over worst-case MMF. © 2007 Optical Society of America.
Abstract:
We experimentally demonstrate the first optical data link at 20 Gb/s using hybrid CAP-4/QAM-4 modulation, with transmission over 4.3 km of SSMF and a power penalty of ~1.5 dBo at a BER of 10⁻⁹. The hybrid CAP-4/QAM-4 link significantly outperforms a reference PAM-4 link. © OSA 2013.
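For context on CAP signalling, the Python sketch below shows a generic textbook CAP transmitter: two binary symbol streams are shaped by orthogonal in-phase and quadrature filters (a root-raised-cosine pulse multiplied by cosine and sine carriers) and summed. This is only a minimal illustration under textbook assumptions, not the paper's hybrid CAP-4/QAM-4 implementation; the function names, filter parameters, and carrier frequency are illustrative choices.

import numpy as np

def rrc_pulse(beta, span, sps):
    # Root-raised-cosine pulse: `span` symbols long, `sps` samples per symbol.
    t = np.arange(-span * sps // 2, span * sps // 2 + 1) / sps
    h = np.zeros(len(t))
    for i, ti in enumerate(t):
        if np.isclose(ti, 0.0):
            h[i] = 1.0 - beta + 4.0 * beta / np.pi
        elif np.isclose(abs(ti), 1.0 / (4.0 * beta)):
            h[i] = (beta / np.sqrt(2)) * ((1 + 2 / np.pi) * np.sin(np.pi / (4 * beta))
                                          + (1 - 2 / np.pi) * np.cos(np.pi / (4 * beta)))
        else:
            num = np.sin(np.pi * ti * (1 - beta)) + 4 * beta * ti * np.cos(np.pi * ti * (1 + beta))
            den = np.pi * ti * (1 - (4 * beta * ti) ** 2)
            h[i] = num / den
    return h / np.sqrt(np.sum(h ** 2))   # unit-energy normalization

def cap_transmit(bits, sps=8, beta=0.3, fc=0.25):
    # Generic CAP transmitter: split bits into two +/-1 symbol streams, shape
    # them with orthogonal in-phase/quadrature filters, and sum the outputs.
    bits = np.asarray(bits)[: 2 * (len(bits) // 2)]   # keep an even number of bits
    a = 2 * bits[0::2] - 1                            # in-phase symbols
    b = 2 * bits[1::2] - 1                            # quadrature symbols
    g = rrc_pulse(beta, span=10, sps=sps)
    n = np.arange(len(g))
    f_i = g * np.cos(2 * np.pi * fc * n)              # in-phase shaping filter
    f_q = g * np.sin(2 * np.pi * fc * n)              # quadrature shaping filter
    up_a = np.zeros(len(a) * sps); up_a[::sps] = a
    up_b = np.zeros(len(b) * sps); up_b[::sps] = b
    return np.convolve(up_a, f_i) - np.convolve(up_b, f_q)

rng = np.random.default_rng(0)
tx = cap_transmit(rng.integers(0, 2, 200))            # example 100-symbol CAP waveform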
Abstract:
Linear dimensionality reduction methods are a cornerstone of analyzing high-dimensional data, due to their simple geometric interpretations and typically attractive computational properties. These methods capture many data features of interest, such as covariance, dynamical structure, correlation between data sets, input-output relationships, and margin between data classes. Methods have been developed with a variety of names and motivations in many fields, and perhaps as a result the connections between all these methods have not been highlighted. Here we survey methods from this disparate literature as optimization programs over matrix manifolds. We discuss principal component analysis, factor analysis, linear multidimensional scaling, Fisher's linear discriminant analysis, canonical correlations analysis, maximum autocorrelation factors, slow feature analysis, sufficient dimensionality reduction, undercomplete independent component analysis, linear regression, distance metric learning, and more. This optimization framework gives insight into some rarely discussed shortcomings of well-known methods, such as the suboptimality of certain eigenvector solutions. Modern techniques for optimization over matrix manifolds enable a generic linear dimensionality reduction solver, which accepts as input data and an objective to be optimized, and returns, as output, an optimal low-dimensional projection of the data. This simple optimization framework further allows straightforward generalizations and novel variants of classical methods, which we demonstrate here by creating an orthogonal-projection canonical correlations analysis. More broadly, this survey and generic solver suggest that linear dimensionality reduction can move toward becoming a black-box, objective-agnostic numerical technology. © 2015 John P. Cunningham and Zoubin Ghahramani.
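As a concrete illustration of the "generic solver" idea in this abstract, the Python sketch below maximizes a user-supplied objective over matrices with orthonormal columns (the Stiefel manifold) by gradient ascent with a QR retraction, plugging in the PCA objective tr(MᵀXXᵀM) as the example. This is a minimal sketch of the general approach rather than the authors' solver; the function names, step size, and retraction choice are assumptions for illustration.

import numpy as np

def stiefel_ascent(X, d, grad_fn, steps=1000, lr=1e-2, seed=0):
    # Maximize an objective over D x d matrices with orthonormal columns
    # (the Stiefel manifold) via gradient ascent with a QR retraction.
    rng = np.random.default_rng(seed)
    D = X.shape[0]
    M, _ = np.linalg.qr(rng.standard_normal((D, d)))   # random orthonormal start
    for _ in range(steps):
        G = grad_fn(X, M)                               # Euclidean gradient
        G_tan = G - M @ (M.T @ G + G.T @ M) / 2         # project onto tangent space
        M, _ = np.linalg.qr(M + lr * G_tan)             # step, then retract to manifold
    return M

def pca_grad(X, M):
    # Gradient of the PCA objective tr(M^T C M), with C the sample covariance.
    C = X @ X.T / X.shape[1]
    return 2 * C @ M

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 500))
X -= X.mean(axis=1, keepdims=True)                      # center the data
M = stiefel_ascent(X, d=2, grad_fn=pca_grad)
# Columns of M approximately span the top-2 principal subspace of X.

Swapping pca_grad for the gradient of another objective (e.g. a discriminant or correlation criterion) reuses the same solver unchanged, which is the point the abstract makes about an objective-agnostic framework.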