NeurIPS 2020

Program Synthesis

Discussion

Datasets

  1. MiniSCAN [5]: The goal of this domain is to learn compositional, language-like rules from a very limited number of examples.
  2. SCAN [4]: Consists of synthetic language commands paired with discrete action sequences. The goal of SCAN is to test the compositional abilities of neural networks when the test data differs systematically from the training data.
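To make the command-to-action mapping concrete, here is a minimal illustrative sketch of a SCAN-style interpreter. It covers only a small subset of the grammar (the full grammar in [4] also includes modifiers such as "around", "opposite", and "after"), so treat it as a toy rendering of the dataset's structure, not the official generator:

```python
# Toy SCAN-style interpreter: maps a command string to a discrete action
# sequence. Simplified subset of the grammar in [4], for illustration only.

PRIMITIVES = {
    "jump": ["JUMP"],
    "walk": ["WALK"],
    "run": ["RUN"],
    "look": ["LOOK"],
}

def interpret(command):
    """Return the action sequence for a (simplified) SCAN command."""
    words = command.split()
    # "X and Y" composes by concatenating the two sub-sequences.
    if "and" in words:
        i = words.index("and")
        return interpret(" ".join(words[:i])) + interpret(" ".join(words[i + 1:]))
    actions = list(PRIMITIVES[words[0]])
    for w in words[1:]:
        if w == "twice":
            actions = actions * 2          # repeat the whole sub-sequence
        elif w == "thrice":
            actions = actions * 3
        elif w == "left":
            actions = ["LTURN"] + actions  # turn before acting
        elif w == "right":
            actions = ["RTURN"] + actions
    return actions
```

For example, `interpret("jump twice")` yields `["JUMP", "JUMP"]`, and `interpret("walk left and run")` yields `["LTURN", "WALK", "RUN"]`. The compositional-generalization test in SCAN holds out systematic combinations (e.g., a primitive never seen with "twice" during training) and asks whether a learner recovers the rule rather than memorizing pairs.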

Papers

1. Learning Compositional Rules via Neural Program Synthesis

Authors: Nye, Solar-Lezama, Tenenbaum, Lake. NeurIPS 2020.

2. Program Synthesis with Pragmatic Communication

Authors: Pu, Ellis, Kryven, Tenenbaum, Solar-Lezama. NeurIPS 2020.

3. Learning abstract structure for drawing by efficient motor program induction

Authors: Tian, Ellis, Kryven, Tenenbaum. NeurIPS 2020.

References

  1. Maxwell I. Nye, Armando Solar-Lezama, Joshua B. Tenenbaum, and Brenden M. Lake. Learning compositional rules via neural program synthesis. In Advances in Neural Information Processing Systems, 2020.
  2. Brenden M Lake. Compositional generalization through meta sequence-to-sequence learning. In Advances in Neural Information Processing Systems, pages 9788–9798, 2019.
  3. Amit Zohar and Lior Wolf. Automatic program synthesis of long programs with a learned garbage collector. In Advances in Neural Information Processing Systems, pages 2094–2103, 2018.
  4. Brenden Lake and Marco Baroni. Generalization without systematicity: On the compositional skills of sequence-to-sequence recurrent networks. In 35th International Conference on Machine Learning, ICML 2018, pages 4487–4499. International Machine Learning Society (IMLS), 2018.
  5. Brenden M Lake, Tal Linzen, and Marco Baroni. Human few-shot learning of compositional instructions. Proceedings of the 41st Annual Conference of the Cognitive Science Society, 2019.
  6. Oleksandr Polozov and Sumit Gulwani. Flashmeta: A framework for inductive program synthesis. ACM SIGPLAN Notices, 50(10):107–126, 2015.
  7. Kevin Ellis and Sumit Gulwani. Learning to learn programs from examples: Going beyond program structure. IJCAI, 2017.
  8. Mike Lewis, Denis Yarats, Yann N Dauphin, Devi Parikh, and Dhruv Batra. Deal or no deal? end-to-end learning for negotiation dialogues. arXiv preprint arXiv:1706.05125, 2017.
  9. Yewen Pu, Kevin Ellis, Marta Kryven, Joshua B. Tenenbaum, and Armando Solar-Lezama. Program synthesis with pragmatic communication. In Advances in Neural Information Processing Systems, 2020.
  10. Lucas Y. Tian, Kevin Ellis, Marta Kryven, and Joshua B. Tenenbaum. Learning abstract structure for drawing by efficient motor program induction. In Advances in Neural Information Processing Systems, 2020.