Recurrent neural network for text classification with multi-task learning

P Liu, X Qiu, X Huang - arXiv preprint arXiv:1605.05101, 2016 - arxiv.org
Neural network-based methods have achieved great progress on a variety of natural language processing tasks. However, in most previous work, the models are learned with single-task supervised objectives, which often suffer from insufficient training data. In this paper, we use the multi-task learning framework to jointly learn across multiple related tasks. Based on recurrent neural networks, we propose three different mechanisms for sharing information to model text with task-specific and shared layers. The entire network is trained jointly on all these tasks. Experiments on four benchmark text classification tasks show that our proposed models can improve the performance of a task with the help of other related tasks.
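The shared-versus-task-specific layering the abstract describes can be sketched as follows. This is a hypothetical minimal illustration, not the paper's model: it uses a plain tanh RNN in place of the paper's LSTM, toy dimensions, and invented names (`encode`, `predict`, `heads`); the idea shown is one shared recurrent encoder whose output feeds a separate classifier head per task.

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN, EMBED, N_CLASSES = 8, 4, 2  # toy sizes, illustrative only

# Shared recurrent layer parameters (a plain tanh RNN stands in for
# the paper's LSTM; weights are shared across all tasks).
W_xh = rng.normal(scale=0.1, size=(EMBED, HIDDEN))
W_hh = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))

# One classification head per task: these are the task-specific layers.
heads = {task: rng.normal(scale=0.1, size=(HIDDEN, N_CLASSES))
         for task in ("task_a", "task_b")}

def encode(x_seq):
    """Run the shared RNN over a sequence of embedding vectors."""
    h = np.zeros(HIDDEN)
    for x in x_seq:
        h = np.tanh(x @ W_xh + h @ W_hh)
    return h

def predict(x_seq, task):
    """Shared encoder, then the given task's own softmax head."""
    logits = encode(x_seq) @ heads[task]
    e = np.exp(logits - logits.max())  # numerically stable softmax
    return e / e.sum()

# Joint training would alternate minibatches drawn from each task's
# dataset: every step updates the shared weights, while each head is
# updated only on its own task's batches.
seq = rng.normal(size=(5, EMBED))  # a dummy 5-token "sentence"
probs = {task: predict(seq, task) for task in heads}
```

The benefit the abstract reports comes from the shared encoder seeing training signal from all tasks, so tasks with scarce labels borrow representational strength from related ones.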