Live Demo · Demonstrations 3

PYLON: A PyTorch Framework for Learning with Constraints

Kareem Ahmed · Tao Li · Nu Mai Thy Ton · Quan Guo · Kai-Wei Chang · Parisa Kordjamshidi · Vivek Srikumar · Guy Van den Broeck · Sameer Singh


Abstract:

Deep learning excels at learning task information from large amounts of data, but it struggles to learn from declarative high-level knowledge that can often be expressed more succinctly and directly. In this work, we introduce PYLON, a neural-symbolic training framework that builds on PyTorch to augment imperatively trained models with declaratively specified knowledge. PYLON lets users programmatically specify constraints as Python functions and compiles them into a differentiable loss, thus training predictive models that fit the data whilst satisfying the specified constraints. PYLON includes both exact and approximate compilers to efficiently compute the loss, employing fuzzy logic, sampling methods, and circuits, ensuring scalability even to complex models and constraints. Crucially, a guiding principle in designing PYLON is the ease with which any existing deep learning codebase can be extended to learn from constraints using only a few lines: a function that expresses the constraint and code to incorporate it as a loss. Our demo comprises models in NLP, computer vision, logical games, and knowledge graphs that can be interactively trained using constraints as supervision.
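
As a rough illustration of the workflow described above, the sketch below shows a PyTorch model extended with a constraint written as an ordinary Python function and turned into a differentiable loss. The `constraint_loss` compiler and the `exactly_one_positive` constraint are hypothetical placeholders standing in for PYLON's actual API, whose names and signatures may differ; the brute-force enumeration is only a stand-in for the framework's compilers.

```python
import torch
import torch.nn as nn

# Hypothetical compiler interface standing in for PYLON's constraint-to-loss
# compilation; the real API may use different names and signatures.
def constraint_loss(constraint_fn, logits):
    """Brute-force sketch: enumerate joint label assignments, sum the model's
    probability of the satisfying ones, and penalize its negative log."""
    probs = torch.softmax(logits, dim=-1)          # (num_vars, num_classes)
    num_vars, num_classes = probs.shape
    sat_prob = torch.tensor(0.0)
    # Enumerate all joint assignments (feasible only for tiny output spaces).
    for idx in range(num_classes ** num_vars):
        assignment, rest = [], idx
        for _ in range(num_vars):
            assignment.append(rest % num_classes)
            rest //= num_classes
        if constraint_fn(assignment):
            p = torch.tensor(1.0)
            for var, label in enumerate(assignment):
                p = p * probs[var, label]
            sat_prob = sat_prob + p
    # Semantic-loss-style objective: negative log-probability of satisfaction.
    return -torch.log(sat_prob + 1e-12)

# Declarative knowledge expressed as a plain Python function over predicted labels.
def exactly_one_positive(y):
    return sum(1 for label in y if label == 1) == 1

model = nn.Linear(4, 2)        # toy model: one prediction per input row
x = torch.randn(3, 4)          # 3 output variables, 4 features each
logits = model(x)              # (3, 2) logits

loss = constraint_loss(exactly_one_positive, logits)
loss.backward()                # the compiled constraint loss is differentiable
```

In PYLON proper, the brute-force enumeration above would be handled by the framework's exact (circuit-based) or approximate (fuzzy-logic or sampling) compilers, which keep the loss tractable for larger output spaces; the constraint loss is then simply added to the existing task loss in the training loop.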