Affiliation:
1. University of Pennsylvania, Philadelphia, USA
Abstract
We consider the problem of synthesizing programs with numerical constants that optimize a quantitative objective, such as accuracy, over a set of input-output examples. We propose a general framework for optimal synthesis of such programs in a given domain-specific language (DSL), with provable optimality guarantees. Our framework enumerates programs in a general search graph, where nodes represent subsets of concrete programs. To improve scalability, it uses A* search in conjunction with a search heuristic based on abstract interpretation; intuitively, this heuristic establishes upper bounds on the value of subtrees in the search graph, enabling the synthesizer to identify and prune subtrees that are provably suboptimal. In addition, we propose a natural strategy for constructing abstract transformers for monotonic semantics, which is a common property of components in DSLs for data classification. Finally, we implement our approach in the context of two such existing DSLs, demonstrating that our algorithm is more scalable than existing optimal synthesizers.
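To make the pruning idea concrete, below is a minimal Python sketch of A* search guided by an abstract-interpretation upper bound, for the toy task of synthesizing one threshold constant that maximizes classification accuracy. Everything in it (the example data, the interval nodes, the upper_bound function) is an illustrative assumption, not the paper's implementation. Note that upper_bound only inspects the endpoints of an interval of constants, which mirrors the natural abstract transformer for monotone semantics mentioned in the abstract.

import heapq
import itertools

# Hypothetical labeled examples (input, label) and a finite grid of
# candidate constants; both are illustrative, not from the paper.
EXAMPLES = [(0.2, 0), (0.4, 0), (0.5, 0), (0.6, 1), (0.9, 1)]
CANDIDATES = [i / 10 for i in range(11)]

def accuracy(c):
    # Concrete semantics: accuracy of the program "classify(x) = x >= c".
    return sum((x >= c) == bool(y) for x, y in EXAMPLES) / len(EXAMPLES)

def upper_bound(lo, hi):
    # Abstract semantics over the interval [lo, hi] of constants: an example
    # (x, 1) is correctly classified by SOME c in the interval iff lo <= x,
    # and (x, 0) iff hi > x. Summing these per-example maxima gives an
    # admissible upper bound on the accuracy of any single constant, using
    # only the interval endpoints (the semantics is monotone in c).
    best = sum((lo <= x) if y else (hi > x) for x, y in EXAMPLES)
    return best / len(EXAMPLES)

def synthesize():
    # A* over a search graph whose nodes are (half-open index) intervals of
    # candidate constants; children split the interval in half. Because the
    # heuristic never underestimates, subtrees whose bound is beaten by
    # another node are never expanded, and the first singleton node popped
    # is a provably optimal program.
    tie = itertools.count()  # tie-breaker so the heap never compares nodes
    heap = [(-upper_bound(CANDIDATES[0], CANDIDATES[-1]),
             next(tie), 0, len(CANDIDATES))]
    while heap:
        neg_b, _, i, j = heapq.heappop(heap)
        if j - i == 1:
            # For a singleton interval the abstract bound coincides with
            # the exact accuracy, so this program is optimal.
            return CANDIDATES[i], accuracy(CANDIDATES[i])
        mid = (i + j) // 2
        for lo, hi in ((i, mid), (mid, j)):
            b = upper_bound(CANDIDATES[lo], CANDIDATES[hi - 1])
            heapq.heappush(heap, (-b, next(tie), lo, hi))
    return None

print(synthesize())  # on this data: (0.6, 1.0), a perfect threshold

On this toy data the root node scores an upper bound of 1.0, the low half of the constant grid is bounded at 0.6 and therefore never expanded, and the search returns c = 0.6 after exploring only the high half; this is the subtree-pruning behavior the abstract describes, in miniature.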
Publisher
Association for Computing Machinery (ACM)