34 results for "Recurrent neural network"


Relevance: 100.00%
Publisher:
Abstract:

We propose a novel low-complexity artificial neural network (ANN)-based nonlinear equalizer (NLE) for coherent optical orthogonal frequency-division multiplexing (CO-OFDM) and compare it with the recent inverse Volterra-series transfer function (IVSTF)-based NLE over up to 1000 km of uncompensated links. A demonstration of the ANN-NLE at 80-Gb/s CO-OFDM using 16-quadrature amplitude modulation reveals Q-factor improvements after 1000-km transmission of 3 dB and 1 dB with respect to linear equalization and the IVSTF-NLE, respectively.
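The abstract gives no implementation details, so the sketch below is purely an illustration of the general idea rather than the paper's design: a small feed-forward ANN, written in PyTorch, acting as a nonlinear equalizer that maps the real and imaginary parts of a received subcarrier and its neighbours to an equalized symbol estimate. The three-subcarrier window, hidden width and activation are all hypothetical assumptions.

import torch
import torch.nn as nn

class ANNEqualizer(nn.Module):
    # Minimal feed-forward nonlinear equalizer sketch; the window of 3
    # subcarriers, the hidden width and the tanh activation are assumptions,
    # not values taken from the paper.
    def __init__(self, n_subcarriers=3, hidden=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * n_subcarriers, hidden),  # real/imag parts of the window
            nn.Tanh(),
            nn.Linear(hidden, 2),                  # equalized real/imag of the target symbol
        )

    def forward(self, x):
        return self.net(x)

# Training would use pairs of received (distorted) and transmitted 16-QAM symbols.
eq = ANNEqualizer()
received = torch.randn(64, 6)   # batch of 64 three-subcarrier windows
equalized = eq(received)        # shape (64, 2)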

Relevance: 100.00%
Publisher:
Abstract:

A novel low-complexity artificial neural network (ANN)-based nonlinear equalizer (NLE) is demonstrated for 40-Gb/s CO-OFDM over 2000 km, revealing a ∼1.5-dB enhancement in Q-factor compared to the inverse Volterra-series transfer function (IVSTF)-based NLE.

Relevance: 100.00%
Publisher:
Abstract:

We study the problem of detecting sentences describing adverse drug reactions (ADRs) and frame it as binary classification. We investigate different neural network (NN) architectures for ADR classification. In particular, we propose two new neural network models: a Convolutional Recurrent Neural Network (CRNN), which concatenates convolutional neural networks with recurrent neural networks, and a Convolutional Neural Network with Attention (CNNA), which adds attention weights to convolutional neural networks. We evaluate the various NN architectures on a Twitter dataset containing informal language and on an Adverse Drug Effects (ADE) dataset constructed by sampling from MEDLINE case reports. Experimental results show that, on both datasets, all the NN architectures considerably outperform traditional maximum entropy classifiers trained on n-grams with different weighting strategies. On the Twitter dataset, all the NN architectures perform similarly, but on the ADE dataset the plain CNN performs better than its more complex variants. Nevertheless, CNNA allows the visualisation of the attention weights assigned to words when making classification decisions and is hence more appropriate for extracting the word subsequences that describe ADRs.
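The abstract does not specify the exact layer configuration, so the following is only a minimal sketch of the CRNN idea, convolutional features fed into a recurrent layer for binary ADR classification, written in PyTorch with placeholder hyperparameters (embedding size, filter count, kernel width and hidden size are assumptions, not values from the paper).

import torch
import torch.nn as nn

class CRNN(nn.Module):
    # Sketch of a Convolutional Recurrent Neural Network for binary ADR
    # sentence classification; all sizes are illustrative assumptions.
    def __init__(self, vocab_size, emb_dim=100, n_filters=64, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # 1-D convolution over the token sequence extracts n-gram-like features.
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size=3, padding=1)
        # A GRU reads the convolutional feature sequence left to right.
        self.rnn = nn.GRU(n_filters, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)    # single logit: ADR vs. non-ADR

    def forward(self, tokens):                            # tokens: (batch, seq_len) int ids
        x = self.embed(tokens)                            # (batch, seq_len, emb_dim)
        x = torch.relu(self.conv(x.transpose(1, 2)))      # (batch, n_filters, seq_len)
        _, h = self.rnn(x.transpose(1, 2))                # h: (1, batch, hidden)
        return self.out(h[-1])                            # (batch, 1) logit

model = CRNN(vocab_size=20000)
logits = model(torch.randint(0, 20000, (8, 40)))          # 8 sentences of 40 tokens

The CNNA variant described in the abstract would instead weight the convolutional features with learned attention scores, which is what makes the per-word contributions visualisable.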

Relevance: 100.00%
Publisher:
Abstract:

In real-world product reviews, the distribution of polarity ratings over reviews written by different users, or over reviews of different products, is often skewed. As such, incorporating user and product information would be helpful for the task of sentiment classification of reviews. However, existing approaches ignore the temporal nature of reviews posted by the same user or concerning the same product. We argue that the temporal relations of reviews might be useful for learning user and product embeddings, and we therefore propose employing a sequence model to embed these temporal relations into user and product representations so as to improve the performance of document-level sentiment analysis. Specifically, we first learn a distributed representation of each review with a one-dimensional convolutional neural network. Then, taking these representations as pretrained vectors, we use a recurrent neural network with gated recurrent units to learn distributed representations of users and products. Finally, we feed the user, product and review representations into a machine learning classifier for sentiment classification. Our approach has been evaluated on three large-scale review datasets from IMDB and Yelp. Experimental results show that: (1) sequence modeling for the purposes of distributed user and product representation learning can improve the performance of document-level sentiment classification; (2) the proposed approach achieves state-of-the-art results on these benchmark datasets.
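As a concrete but heavily simplified reading of this pipeline, the sketch below encodes each review with a one-dimensional CNN and then runs a GRU over a user's chronologically ordered review vectors to obtain a user representation. All dimensions are placeholder assumptions, the product branch (which works the same way over reviews of one product) is omitted, and the final sentiment classifier is left out.

import torch
import torch.nn as nn

class ReviewEncoder(nn.Module):
    # 1-D CNN that turns a tokenized review into a fixed-size vector
    # (illustrative sizes, not the paper's configuration).
    def __init__(self, vocab_size, emb_dim=100, n_filters=100):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size=3, padding=1)

    def forward(self, tokens):                       # (batch, seq_len)
        x = self.embed(tokens).transpose(1, 2)       # (batch, emb_dim, seq_len)
        x = torch.relu(self.conv(x))                 # (batch, n_filters, seq_len)
        return x.max(dim=2).values                   # max-pool over time

class UserEncoder(nn.Module):
    # GRU over the chronological sequence of a user's review vectors;
    # its last hidden state serves as the user representation.
    def __init__(self, n_filters=100, hidden=100):
        super().__init__()
        self.gru = nn.GRU(n_filters, hidden, batch_first=True)

    def forward(self, review_vecs):                  # (batch, n_reviews, n_filters)
        _, h = self.gru(review_vecs)
        return h[-1]                                 # (batch, hidden)

# Usage: encode each review, stack one user's reviews in time order, then
# concatenate [user, product, review] vectors for the downstream classifier.
enc, user_enc = ReviewEncoder(vocab_size=50000), UserEncoder()
reviews = torch.randint(0, 50000, (5, 60))           # 5 reviews, 60 tokens each
vecs = enc(reviews)                                   # (5, 100)
user_vec = user_enc(vecs.unsqueeze(0))                # (1, 100)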