Genetic programming is a classic discrete optimization technique with numerous applications in science and engineering. In this talk, I will present a new way of enhancing genetic programming with large language models (LLMs). Like traditional evolutionary methods, our approach maintains a pool of programs, but it uses zero-shot queries to an LLM to discover and evolve abstract concepts that recur in known high-fitness programs. We discover new programs using a mix of standard evolutionary steps and LLM-guided steps conditioned on the discovered concepts. Newly discovered programs then feed a further round of concept abstraction and evolution. We evaluate the approach in two settings: symbolic regression and descriptor-based image classification. In both settings, the concept-guided search substantially outperforms state-of-the-art baselines.
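The loop described above (evolve a pool, abstract concepts from the elite with an LLM, then mix standard and concept-conditioned steps) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the functions `abstract_concepts` and `llm_guided_mutation` stand in for zero-shot LLM calls and are stubbed with hand-written placeholders, and the symbolic-regression task, primitives, and fitness function are all assumptions for the example.

```python
import random

# Hypothetical concept-guided genetic programming for symbolic regression.
# The LLM-dependent pieces are placeholders, not the actual method.

TARGET = lambda x: x * x + x           # assumed ground-truth function to recover
PRIMITIVES = ["x", "1"]
OPS = ["+", "*"]

def random_program(depth=2):
    """Sample a random arithmetic expression over the primitives."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(PRIMITIVES)
    left, right = random_program(depth - 1), random_program(depth - 1)
    return f"({left} {random.choice(OPS)} {right})"

def fitness(prog):
    """Negative squared error against TARGET on sample points (higher is better)."""
    try:
        return -sum((eval(prog, {"x": x}) - TARGET(x)) ** 2 for x in range(-3, 4))
    except Exception:
        return float("-inf")

def standard_mutation(prog):
    # Simplistic evolutionary step: resample a fresh program.
    return random_program()

def abstract_concepts(elite):
    # Placeholder for a zero-shot LLM query that summarizes recurring
    # structure in high-fitness programs (e.g. "multiply x by itself").
    return ["combine x with itself multiplicatively"]

def llm_guided_mutation(prog, concepts):
    # Placeholder for an LLM proposal conditioned on the concepts; here we
    # crudely mimic "following a concept" by biasing toward multiplication.
    return f"({random.choice(PRIMITIVES)} * {prog})"

def evolve(pool_size=30, generations=40, seed=0):
    random.seed(seed)
    pool = [random_program() for _ in range(pool_size)]
    for _ in range(generations):
        pool.sort(key=fitness, reverse=True)
        elite = pool[: pool_size // 5]
        concepts = abstract_concepts(elite)        # concept-abstraction round
        children = [
            standard_mutation(p) if random.random() < 0.5
            else llm_guided_mutation(p, concepts)  # concept-conditioned step
            for p in elite
        ]
        fresh = [random_program() for _ in range(pool_size - len(elite) - len(children))]
        pool = elite + children + fresh
    return max(pool, key=fitness)

best = evolve()
print(best, fitness(best))
```

In a real system, the two placeholder functions would prompt an LLM: one to verbalize abstractions from the elite programs, the other to propose program edits conditioned on those abstractions, with the loop otherwise unchanged.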
Workshop III: Naturalistic Approaches to Artificial Intelligence