From Perception to Programs: Regularize, Overparameterize, and Amortize

Published: 20 Jun 2023, Last Modified: 17 Sept 2023
Venue: Differentiable Almost Everything
Keywords: Program synthesis, neurosymbolic, learning, reasoning
TL;DR: We develop a gradient-descent-based framework for synthesizing symbolic programs that invoke neural networks.
Abstract: We develop techniques for synthesizing neurosymbolic programs. Such programs mix discrete symbolic processing with continuous neural computation. We relax this mixed discrete/continuous problem and jointly learn all modules with gradient descent, incorporating amortized inference, overparameterization, and a differentiable strategy for penalizing lengthy programs. Collectively, this toolbox improves the stability of gradient-guided program search and suggests ways of learning both how to parse continuous input into discrete abstractions and how to process those abstractions via symbolic code.
Submission Number: 66
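To make the abstract's core idea concrete, here is a minimal sketch, not the authors' actual method, of relaxing a discrete program-synthesis problem so it can be optimized by gradient descent, together with a differentiable penalty on program length. The primitive set, the softmax relaxation over per-slot operation choices, and all function names (`run_program`, `length_penalty`, etc.) are illustrative assumptions; the paper's framework additionally uses amortized inference and overparameterization, which are not shown here.

```python
import jax
import jax.numpy as jnp

# Candidate primitive operations applied to a scalar; index 0 is a no-op so
# unused slots can effectively be dropped from the program.
def apply_ops(x):
    return jnp.stack([x, x + 1.0, 2.0 * x, x * x])   # [noop, add1, double, square]

def run_program(logits, x, temperature=0.5):
    """Run a relaxed program: each slot is a softmax mixture of primitives."""
    for slot_logits in logits:                        # one row of logits per slot
        weights = jax.nn.softmax(slot_logits / temperature)
        x = jnp.dot(weights, apply_ops(x))            # soft choice of primitive
    return x

def length_penalty(logits, temperature=0.5):
    """Differentiable proxy for program length: expected number of non-no-op slots."""
    probs = jax.nn.softmax(logits / temperature, axis=-1)
    return jnp.sum(1.0 - probs[:, 0])

def loss_fn(logits, xs, ys, lam=0.01):
    preds = jax.vmap(lambda x: run_program(logits, x))(xs)
    return jnp.mean((preds - ys) ** 2) + lam * length_penalty(logits)

# Fit a 4-slot program to the target f(x) = 2x + 1 by plain gradient descent.
key = jax.random.PRNGKey(0)
logits = 0.01 * jax.random.normal(key, (4, 4))        # 4 slots x 4 primitives
xs = jnp.linspace(-2.0, 2.0, 16)
ys = 2.0 * xs + 1.0
grad_fn = jax.jit(jax.grad(loss_fn))
for step in range(2000):
    logits = logits - 0.1 * grad_fn(logits, xs, ys)

print(jnp.argmax(logits, axis=-1))                    # discretized program after training
```

After training, taking the argmax over each slot's logits discretizes the relaxed program back into symbolic code; the length penalty encourages gradient descent to push unneeded slots toward the no-op, which is one simple way to realize the "penalizing lengthy programs" idea mentioned in the abstract.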