Affiliation:
1. School of Software, Henan Polytechnic University, Jiaozuo, Henan, China
Abstract
The Boolean satisfiability (SAT) problem exhibits different structural features across domains. Unlike traditional rule-based approaches, neural network models can serve as more general algorithms that learn to solve specific problems from domain data. Accurately identifying these structural features is crucial for neural networks to solve the SAT problem. Learning-based SAT solvers, whether end-to-end models or enhancements to traditional heuristic algorithms, have achieved significant progress. In this article, we propose TG-SAT, an end-to-end framework based on the Transformer and the gated recurrent unit (GRU) for predicting the satisfiability of SAT problems. TG-SAT can learn the structural features of SAT problems in a weakly supervised setting. To capture the structural information of a SAT problem, we encode it as an undirected graph and integrate a GRU into the Transformer structure to update the node embeddings. By computing cross-attention scores between literals and clauses, a weighted representation of the nodes is obtained. The model is ultimately trained as a classifier to predict the satisfiability of the SAT problem. Experimental results demonstrate that TG-SAT achieves a 2%–5% improvement in accuracy on random 3-SAT problems compared to NeuroSAT. It also outperforms NeuroSAT on SR(N) instances, especially on more complex SAT problems, where our model achieves higher prediction accuracy.
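The abstract describes one message-passing round as cross-attention from literal nodes to clause nodes followed by a GRU update of the literal embeddings. The paper's actual architecture is not reproduced here; the following is a minimal NumPy sketch of that round under assumed toy shapes (embedding width `d`, literal/clause counts, and random parameters are all hypothetical, and the real model would be trained end to end in a deep-learning framework):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys_values, d):
    # One row of scores per query (literal), one column per key (clause).
    scores = queries @ keys_values.T / np.sqrt(d)
    weights = softmax(scores, axis=-1)
    # Weighted representation of each query node over the clause embeddings.
    return weights @ keys_values

def gru_update(h, x, params):
    # Single GRU cell step: hidden state h updated by attention message x.
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = 1 / (1 + np.exp(-(x @ Wz + h @ Uz)))   # update gate
    r = 1 / (1 + np.exp(-(x @ Wr + h @ Ur)))   # reset gate
    h_cand = np.tanh(x @ Wh + (r * h) @ Uh)    # candidate state
    return (1 - z) * h + z * h_cand

d = 8                        # embedding width (hypothetical)
n_lits, n_clauses = 6, 4     # toy problem size
lits = rng.normal(size=(n_lits, d))
clauses = rng.normal(size=(n_clauses, d))
params = [rng.normal(scale=0.1, size=(d, d)) for _ in range(6)]

msg = cross_attention(lits, clauses, d)   # clause information gathered per literal
lits = gru_update(lits, msg, params)      # GRU carries state across rounds
print(lits.shape)
```

In the described framework this round would be repeated for several iterations (with a symmetric clause-side update), after which the node embeddings are pooled and passed to a classifier that predicts satisfiability.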
Funder
The National Natural Science Foundation of China
Young Elite Teachers in Henan Province
Doctor Foundation of Henan Polytechnic University
Innovative and Scientific Research Team of Henan Polytechnic University
References (33 articles):
1. Learning to solve circuit-SAT: an unsupervised differentiable approach;Amizadeh,2018
2. Machine learning for combinatorial optimization: a methodological tour d’horizon;Bengio;European Journal of Operational Research,2021
3. Graph neural networks and boolean satisfiability;Bünz,2017
4. Learning phrase representations using RNN Encoder–Decoder for statistical machine translation;Cho,2014
5. The complexity of theorem-proving procedures;Cook,1971