In the past two decades, machine learning workloads have been transformed by the availability of programmable, high-throughput numerical accelerator hardware. By contrast, discrete combinatorial optimization research has largely remained the remit of conventional CPUs. We investigate the application of stochastic search to Maximum Satisfiability (MaxSAT) through a procedure designed to leverage the high-throughput matrix multiplication capabilities of modern neural network accelerators. Given a MaxSAT problem instance in conjunctive normal form, our procedure constructs a Restricted Boltzmann Machine with an equilibrium distribution wherein the probability of any Boolean assignment is exponential in the number of clauses it satisfies. We search the space of satisfying assignments using chain-parallel, device-parallel block Gibbs sampling, periodically improving assignments using a classical heuristic. We present timed results on a subset of problem instances from annual MaxSAT
Evaluations for the years 2018 to 2021, as well as an ablation study. Results demonstrate that our simple stochastic search approach can exploit horizontally scaled accelerator parallelism to tackle challenging combinatorial optimization problems, suggesting new directions for the study of such problems.
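To make the sampling procedure concrete, the sketch below runs chain-parallel block Gibbs sampling on a small RBM while scoring visible states by the number of CNF clauses they satisfy. This is an illustration of the general technique only: the clause set, chain count, step count, and especially the RBM weights are placeholder assumptions, not the paper's actual CNF-to-RBM construction or its classical improvement heuristic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny example CNF in DIMACS-style signed-literal form (placeholder instance).
clauses = [[1, -2], [-1, 3], [2, 3], [-3, -2]]
n_vars = 3

def num_satisfied(assign, clauses):
    """assign: (chains, n_vars) 0/1 array -> per-chain satisfied-clause counts."""
    counts = np.zeros(assign.shape[0], dtype=int)
    for cl in clauses:
        sat = np.zeros(assign.shape[0], dtype=bool)
        for lit in cl:
            v = assign[:, abs(lit) - 1]
            sat |= (v == 1) if lit > 0 else (v == 0)
        counts += sat
    return counts

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative RBM: one hidden unit per clause. These random weights are a
# stand-in; the paper derives weights so that the equilibrium probability of
# an assignment is exponential in its satisfied-clause count.
n_hidden = len(clauses)
W = rng.normal(scale=0.5, size=(n_vars, n_hidden))
a = np.zeros(n_vars)    # visible biases
b = np.zeros(n_hidden)  # hidden biases

chains = 64  # many chains advance together: one matmul per half-step
v = rng.integers(0, 2, size=(chains, n_vars)).astype(float)

best_count, best_assign = -1, None
for _ in range(200):
    # Block Gibbs: sample all hidden units given visibles, then all visibles
    # given hiddens; both conditionals factorize, so each is one matmul.
    h = (rng.random((chains, n_hidden)) < sigmoid(v @ W + b)).astype(float)
    v = (rng.random((chains, n_vars)) < sigmoid(h @ W.T + a)).astype(float)
    counts = num_satisfied(v.astype(int), clauses)
    i = int(counts.argmax())
    if counts[i] > best_count:
        best_count, best_assign = int(counts[i]), v[i].astype(int).copy()

print(best_count, best_assign)
```

The two matrix products per step are the point of the design: every chain, and every unit within a chain, is updated by the same dense matmul, which is exactly the operation that neural network accelerators execute at high throughput.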