MorphTE: Injecting Morphology in Tensorized Embeddings
Guobing Gan · Peng Zhang · Sunzhu Li · Xiuqing Lu · Benyou Wang

Thu Dec 08 05:00 PM -- 07:00 PM (PST)
In the era of deep learning, word embeddings are essential for text tasks. However, storing and accessing these embeddings requires a large amount of space, which hinders the deployment of such models on resource-limited devices. Leveraging the powerful compression capability of the tensor product, we propose a word embedding compression method with morphological augmentation, Morphologically-enhanced Tensorized Embeddings (MorphTE). A word consists of one or more morphemes, the smallest units that bear meaning or serve a grammatical function. MorphTE represents a word embedding as an entangled form of its morpheme vectors via the tensor product, which injects prior semantic and grammatical knowledge into the learning of embeddings. Furthermore, the dimensionality of the morpheme vectors and the size of the morpheme vocabulary are much smaller than those of words, which greatly reduces the number of word embedding parameters. We conduct experiments on tasks such as machine translation and question answering. Experimental results on four translation datasets of different languages show that MorphTE can compress word embedding parameters by about $20$ times without performance loss and significantly outperforms related embedding compression methods.
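The compression idea in the abstract can be sketched as follows. This is a minimal rank-1 illustration with a toy morpheme vocabulary and made-up sizes (the actual MorphTE construction is richer, e.g. combining multiple such products), but it shows why storing small morpheme vectors and composing them with the tensor product saves parameters:

```python
import numpy as np

# Toy setup (names and sizes are illustrative, not from the paper):
# a small morpheme vocabulary, each morpheme mapped to a low-dimensional vector.
rng = np.random.default_rng(0)
q = 8  # morpheme vector dimension (much smaller than a word embedding dim)
morpheme_vecs = {m: rng.standard_normal(q) for m in ["un", "break", "able"]}

def word_embedding(segmentation):
    """Compose a word embedding as the tensor (Kronecker) product of its
    morpheme vectors: n morphemes of dim q yield a q**n-dimensional vector."""
    vec = morpheme_vecs[segmentation[0]]
    for m in segmentation[1:]:
        vec = np.kron(vec, morpheme_vecs[m])
    return vec

# "unbreakable" segmented into 3 morphemes -> an 8**3 = 512-dim embedding,
# while only 3 * 8 = 24 parameters are actually stored.
emb = word_embedding(["un", "break", "able"])
print(emb.shape)  # (512,)
```

Because many words share morphemes, the stored parameter count scales with the (small) morpheme vocabulary times q, rather than with the word vocabulary times the full embedding dimension.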

Author Information

Guobing Gan (Tianjin University)
Peng Zhang (Tianjin University)
Sunzhu Li (Tianjin University)
Xiuqing Lu (Tianjin University)
Benyou Wang (Università degli Studi di Padova)
