Tripod: Learning Latent Representations for Sequences

EasyChair Preprint 3196
9 pages • Date: April 18, 2020

Abstract

We propose a new model for learning and extracting latent representations from sequences. It generates a tripartite representation with global-style, memory-based, and summary-based partitions. We show the relevance of these representations on two mainstream tasks, text similarity and natural language inference. We argue that the generic nature of this approach makes it applicable to many other tasks that involve modelling discrete-valued (time-ordered) sequences and, with some modifications, even to image and speech processing. We encourage everyone to try our open-source code and our Python 3 API.

Keyphrases: Natural Language Processing, Paragraphs, embeddings, latent representations, machine learning, sentences, sequence
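The abstract does not specify how the three partitions are produced, so the sketch below is only one plausible reading of a tripartite latent split: a single recurrent encoder with three pooling heads, one per partition. It is not Tripod's actual architecture or API; every name here (TripartiteEncoder, part_dim, the three heads) is a hypothetical illustration.

```python
# Minimal sketch of a sequence encoder emitting a tripartite latent
# representation (global style / memory / summary). Hypothetical code,
# not the Tripod implementation described in the paper.
import torch
import torch.nn as nn

class TripartiteEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256, part_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        # One projection head per partition of the latent representation.
        self.style_head = nn.Linear(hidden_dim, part_dim)    # global style
        self.memory_head = nn.Linear(hidden_dim, part_dim)   # memory-based
        self.summary_head = nn.Linear(hidden_dim, part_dim)  # summary-based

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer tensor
        states, last = self.rnn(self.embed(token_ids))
        # Style from mean-pooled states, memory from max-pooled states,
        # summary from the final hidden state (arbitrary but distinct choices).
        style = self.style_head(states.mean(dim=1))
        memory = self.memory_head(states.max(dim=1).values)
        summary = self.summary_head(last.squeeze(0))
        return style, memory, summary

enc = TripartiteEncoder(vocab_size=10000)
z_style, z_memory, z_summary = enc(torch.randint(0, 10000, (2, 20)))
```

Downstream tasks such as text similarity or natural language inference could then consume the concatenation of the three partitions, or select individual partitions depending on which aspect of the sequence they need.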