Neural Guided Constraint Logic Programming for Program Synthesis

Synthesizing programs from input/output examples is a classic problem in artificial intelligence. We present a method for solving Programming By Example (PBE) problems by using a neural model to guide the search of a constraint logic programming system called miniKanren. Crucially, the neural model takes miniKanren's internal representation as input; miniKanren represents a PBE problem as recursive constraints imposed by the provided examples. We explore Recurrent Neural Network and Graph Neural Network models. We contribute a modified miniKanren, drivable by an external agent, available at this https URL. We show that our neural-guided, constraint-based approach can synthesize programs faster in many cases and, importantly, can generalize to larger problems.
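
To make the encoding concrete, here is a minimal miniKanren sketch (not taken from the paper; it assumes a relational Scheme interpreter evalo in the style of Byrd's miniKanren work, and a hypothetical list-reversal task). Each input/output example becomes one evalo goal constraining the unknown program q, and recursively unfolding these goals is what yields the constraint representation the neural model consumes:

  ;; Minimal sketch, assuming an evalo relation (expr, value) is available.
  (run 1 (q)                              ; ask for one program q satisfying all goals
    (evalo `(,q '(1 2 3)) '(3 2 1))       ; example 1: q applied to (1 2 3) must yield (3 2 1)
    (evalo `(,q '(a b))   '(b a)))        ; example 2: q applied to (a b) must yield (b a)

In a query of this form, the search branches over candidate expansions of q; that is the point at which an external agent, such as the RNN or GNN model described above, can be asked to score and select which candidate to expand next.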
