Jason Eisner, "BiLSTM-FSTs and Neural FSTs"

Sat Dec 08 02:00 PM -- 02:30 PM (PST)

How should one apply deep learning to tasks such as morphological reinflection, which stochastically edit one string into another? Finite-state transducers (FSTs) are a well-understood formalism for scoring such edit sequences, which represent latent hard monotonic alignments. I will discuss options for combining this architecture with neural networks. The BiLSTM-FST scores each edit in its full input context, which preserves the ability to do exact inference over the aligned outputs using dynamic programming. The Neural FST scores each edit sequence using an LSTM, which requires approximate inference via methods such as beam search or particle smoothing. Finally, I will sketch how to use the language of regular expressions to specify not only the legal edit sequences but also how to present them to the LSTMs.
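As a rough illustration of the exact-inference point above (a minimal sketch, not code from the talk): the total weight of all monotonic edit sequences between an input string and an output string can be computed by the forward algorithm over the standard edit-distance lattice. The function names and the constant toy scores below are hypothetical; in a BiLSTM-FST, `score` would instead be derived from the BiLSTM state at each input position, so that each edit is scored in its full input context.

```python
import math

def logsumexp(vals):
    """Numerically stable log(sum(exp(v) for v in vals))."""
    m = max(vals)
    if m == -math.inf:
        return -math.inf
    return m + math.log(sum(math.exp(v - m) for v in vals))

def edit_log_marginal(x, y, score):
    """Log total weight of all monotonic edit sequences rewriting x
    into y, summed by the forward algorithm over the edit-distance
    lattice.  score(op, i, j) gives the log-weight of applying op
    ('del', 'ins', or 'sub') at input position i / output position j;
    a BiLSTM-FST would compute it from contextual encodings of x.
    """
    n, m = len(x), len(y)
    alpha = [[-math.inf] * (m + 1) for _ in range(n + 1)]
    alpha[0][0] = 0.0  # empty alignment prefix
    for i in range(n + 1):
        for j in range(m + 1):
            if i == 0 and j == 0:
                continue
            cands = []
            if i > 0:  # delete x[i-1]
                cands.append(alpha[i - 1][j] + score('del', i - 1, j))
            if j > 0:  # insert y[j-1]
                cands.append(alpha[i][j - 1] + score('ins', i, j - 1))
            if i > 0 and j > 0:  # substitute x[i-1] -> y[j-1]
                cands.append(alpha[i - 1][j - 1] + score('sub', i - 1, j - 1))
            alpha[i][j] = logsumexp(cands)
    return alpha[n][m]

def toy_score(op, i, j):
    # Hypothetical constant log-weights, standing in for
    # BiLSTM-derived, context-dependent edit scores.
    return {'del': -2.0, 'ins': -2.0, 'sub': -0.5}[op]

print(edit_log_marginal("cat", "cats", toy_score))
```

Because the lattice has O(|x| * |y|) states, this sum (and the corresponding Viterbi alignment) stays tractable even when each edge score is an arbitrary function of the full input, which is what the abstract's "exact inference via dynamic programming" refers to; conditioning scores on the output history instead, as in the Neural FST, breaks this decomposition and forces approximate inference.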

Author Information

Jason Eisner (Johns Hopkins University)

Jason Eisner is Professor of Computer Science at Johns Hopkins University, as well as Director of Research at Microsoft Semantic Machines. He is a Fellow of the Association for Computational Linguistics. At Johns Hopkins, he is also affiliated with the Center for Language and Speech Processing, the Machine Learning Group, the Cognitive Science Department, and the national Center of Excellence in Human Language Technology. His goal is to develop the probabilistic modeling, inference, and learning techniques needed for a unified model of all kinds of linguistic structure. His 135+ papers have presented various algorithms for parsing, machine translation, and weighted finite-state machines; formalizations, algorithms, theorems, and empirical results in computational phonology; and unsupervised or semi-supervised learning methods for syntax, morphology, and word-sense disambiguation. He is also the lead designer of Dyna, a new declarative programming language that provides an infrastructure for AI research. He has received two school-wide awards for excellence in teaching, as well as recent Best Paper Awards at ACL 2017 and EMNLP 2019.
