CSE: Conceptual sentence embeddings based on attention model

Y Wang, HY Huang, C Feng, Q Zhou, J Gu, X Gao
Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, 2016 - aclanthology.org
Abstract
Most sentence embedding models represent each sentence using only word surface forms, which leaves them unable to discriminate between senses of ubiquitous homonymous and polysemous words. To enhance the representational capability of sentences, we employ a conceptualization model to assign associated concepts to each sentence in the text corpus and then learn conceptual sentence embeddings (CSE). This semantic representation is more expressive than widely used text representation models such as latent topic models, especially for short texts. Moreover, we extend the CSE models with a local attention-based model that selects the relevant words within the context to make more efficient predictions. In the experiments, we evaluate the CSE models on two tasks, text classification and information retrieval. The experimental results show that the proposed models outperform typical sentence embedding models.
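To make the idea in the abstract concrete, the sketch below illustrates one plausible reading of it: a sentence is mapped to weighted concepts by a conceptualization step, a local attention mechanism scores each context word against the concept vector, and the sentence embedding combines the attention-weighted word vectors with the concept vector. This is a minimal toy sketch, not the authors' implementation; the embedding dimension, the toy `conceptualize` function, and all vectors are hypothetical assumptions.

```python
# Minimal illustrative sketch of an attention-based conceptual sentence
# embedding. All lookup tables and the conceptualizer are toy stand-ins.
import numpy as np

rng = np.random.default_rng(0)
DIM = 50  # assumed embedding dimension

# Toy tables standing in for learned word and concept embeddings.
word_vecs = {w: rng.normal(size=DIM) for w in ["apple", "released", "a", "new", "phone"]}
concept_vecs = {c: rng.normal(size=DIM) for c in ["company", "fruit", "device"]}

def conceptualize(sentence_words):
    """Hypothetical stand-in for the conceptualization step: map a sentence
    to associated concepts with weights (e.g. from a knowledge base)."""
    return {"company": 0.7, "device": 0.3}  # assumed output for the toy sentence

def attention_weights(word_matrix, query):
    """Local attention: softmax of dot-product scores between words and query."""
    scores = word_matrix @ query
    scores -= scores.max()          # numerical stability
    exp = np.exp(scores)
    return exp / exp.sum()

def conceptual_sentence_embedding(sentence_words):
    """Attention-weighted word average concatenated with the concept vector."""
    W = np.stack([word_vecs[w] for w in sentence_words])
    concepts = conceptualize(sentence_words)
    concept_vec = sum(p * concept_vecs[c] for c, p in concepts.items())
    alpha = attention_weights(W, concept_vec)   # relevance of each context word
    sent_vec = alpha @ W                        # attention-weighted word sum
    return np.concatenate([sent_vec, concept_vec])

print(conceptual_sentence_embedding(["apple", "released", "a", "new", "phone"]).shape)
```

The concept vector here plays two roles, as the abstract suggests: it serves as the attention query that picks out sense-relevant words, and it is appended to the final representation so that homonymous sentences with different concept distributions receive different embeddings.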