As the frontiers of artificial intelligence advance more rapidly than ever before, generative language models like GPT are already causing significant economic and social transformation. Beyond their remarkable performance on typical language tasks (such as generating text from a prompt), language models are being rapidly adopted as powerful ansatz wavefunctions for quantum many-body systems. In this talk, I will discuss the use of language models for learning quantum states realized in experimental Rydberg atom arrays. By combining variational optimization with data-driven learning from qubit projective measurements, I will show how language models are poised to become one of the most powerful computational tools in our arsenal for the design and characterization of quantum simulators and computers.
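To make the data-driven side of this idea concrete, here is a minimal sketch (not the speaker's code) of an autoregressive "language model" over qubit measurement outcomes, trained by maximum likelihood on projective measurement snapshots; the model architecture (a small GRU), the qubit count, and all hyperparameters are illustrative assumptions, and the training data here is a random placeholder rather than experimental Rydberg data.

```python
# Sketch: autoregressive model p(s_1,...,s_N) = prod_i p(s_i | s_<i)
# over 0/1 measurement outcomes, fit to measurement snapshots by
# minimizing the negative log-likelihood.
import torch
import torch.nn as nn

N_QUBITS = 16  # assumed number of qubits in the array


class QubitLanguageModel(nn.Module):
    """GRU-based autoregressive distribution over qubit bitstrings."""

    def __init__(self, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(3, hidden)   # tokens: 0, 1, and a start token (2)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)       # logits for the next outcome

    def log_prob(self, s):
        # s: (batch, N_QUBITS) tensor of 0/1 measurement outcomes
        start = torch.full((s.shape[0], 1), 2, dtype=torch.long, device=s.device)
        inp = torch.cat([start, s[:, :-1]], dim=1)
        h, _ = self.rnn(self.embed(inp))
        logits = self.head(h)                  # (batch, N_QUBITS, 2)
        logp = torch.log_softmax(logits, dim=-1)
        return logp.gather(-1, s.unsqueeze(-1)).squeeze(-1).sum(dim=1)


def train_on_measurements(model, snapshots, epochs=100, lr=1e-3):
    """Data-driven step: maximize likelihood of observed projective measurements."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = -model.log_prob(snapshots).mean()  # negative log-likelihood
        loss.backward()
        opt.step()
    return model


if __name__ == "__main__":
    # Placeholder data: random bitstrings standing in for experimental snapshots.
    fake_data = torch.randint(0, 2, (512, N_QUBITS))
    model = train_on_measurements(QubitLanguageModel(), fake_data, epochs=5)
    print("trained NLL:", -model.log_prob(fake_data).mean().item())
```

In practice this maximum-likelihood step would be combined with variational optimization, e.g. by also minimizing the expected energy of a Rydberg Hamiltonian estimated from samples of the same autoregressive model, which is omitted here for brevity.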