Journal of East China Normal University (Natural Science), 2019, Vol. 2019, Issue (5): 113-122, 167. doi: 10.3969/j.issn.1000-5641.2019.05.009

• Computational Intelligence in Emergent Applications •

Self-attention based neural networks for product title compression

FU Yu, LI You, LIN Yu-ming, ZHOU Ya   

  1. Guangxi Key Laboratory of Trusted Software, Guilin University of Electronic Technology, Guilin, Guangxi 541004, China
  • Received: 2019-07-28; Online: 2019-09-25; Published: 2019-10-11

Abstract: E-commerce product title compression has received significant attention in recent years because it can provide more specific information for cross-platform knowledge alignment and multi-source data fusion. Product titles usually contain redundant descriptions, which can lead to inconsistencies. In this paper, we propose a self-attention based neural network for this task. Because the self-attention mechanism cannot directly capture the sequential features of product names, we enhance the mapping networks with a dot-attention structure, whose query and key-value pairs are computed by a gated recurrent unit (GRU) based recurrent neural network. The proposed method improves the analytical capability of the model at a relatively low computational cost. Based on data from LESD4EC, we built two E-commerce datasets of product core phrases, named LESD4EC L and LESD4EC S, and tested the model on both. A series of experiments shows that the proposed model achieves better performance on product title compression than existing techniques.
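To make the described architecture concrete, the following is a minimal PyTorch sketch of the idea stated in the abstract: a GRU encoder supplies the query and key-value pairs for a scaled dot-product self-attention layer whose output scores each title token. This is an illustrative reconstruction under stated assumptions, not the authors' code; the class name, layer sizes, and the per-token keep/drop classification head are all assumptions not given in the abstract.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GRUDotAttentionCompressor(nn.Module):
    """Illustrative sketch: dot-product self-attention whose query and
    key-value pairs come from a GRU encoder, so that sequence order
    information reaches the attention layer. Sizes are assumptions."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional GRU captures the sequential features of the title.
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                          bidirectional=True)
        # Project GRU states into query / key / value spaces.
        self.q_proj = nn.Linear(2 * hidden_dim, hidden_dim)
        self.k_proj = nn.Linear(2 * hidden_dim, hidden_dim)
        self.v_proj = nn.Linear(2 * hidden_dim, hidden_dim)
        # Hypothetical head: score each token as keep (1) or drop (0).
        self.classifier = nn.Linear(hidden_dim, 2)

    def forward(self, token_ids):
        x = self.embed(token_ids)                # (B, T, E)
        h, _ = self.gru(x)                       # (B, T, 2H)
        q, k, v = self.q_proj(h), self.k_proj(h), self.v_proj(h)
        # Scaled dot-product self-attention over the title tokens.
        scores = q @ k.transpose(1, 2) / (q.size(-1) ** 0.5)  # (B, T, T)
        context = F.softmax(scores, dim=-1) @ v               # (B, T, H)
        return self.classifier(context)          # per-token keep/drop logits

# Toy usage: a batch of 2 titles, 12 token ids each.
model = GRUDotAttentionCompressor(vocab_size=10000)
logits = model(torch.randint(0, 10000, (2, 12)))  # shape (2, 12, 2)
```

Computing the queries, keys, and values from GRU hidden states rather than from raw token embeddings is what injects the sequence-order information that a plain self-attention layer lacks.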

Key words: self-attention mechanism, product title compression, gated recurrent units
