Mengdi Wu, Xinhao Cheng, Shengyu Liu†, Chunan Shi†, Jianan Ji, Man Kit Ao, Praveen Velliengiri‡, Xupeng Miao#, Oded Padon*, Zhihao Jia
Carnegie Mellon University
†Peking University
‡Pennsylvania State University
#Purdue University
*Weizmann Institute of Science
We introduce Mirage, the first multi-level superoptimizer for tensor programs. A key idea in Mirage is μGraphs, a uniform representation of tensor programs at the kernel, thread block, and thread levels of the GPU compute hierarchy. μGraphs enable Mirage to discover novel optimizations that combine algebraic transformations, schedule transformations, and generation of new custom kernels. To navigate the large search space, Mirage introduces a pruning technique based on abstraction that significantly reduces the search space and provides a certain optimality guarantee. To ensure that the optimized μGraph is equivalent to the input program, Mirage introduces a probabilistic equivalence verification procedure with strong theoretical guarantees. Our evaluation shows that Mirage significantly outperforms existing approaches even for DNNs that are widely used and heavily optimized. Mirage is publicly available at https://github.com/mirage-project/mirage.
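The central notion in the abstract is a single, uniform graph representation that spans the kernel, thread block, and thread levels of the GPU compute hierarchy. As a rough, hypothetical illustration of what such a nested multi-level representation might look like, the Python sketch below defines toy Graph and Op classes in which an operator at one level can be refined by a lower-level graph; these class names, fields, and the example operators are illustrative assumptions, not Mirage's actual μGraph data structures or API.

# Hypothetical sketch (not Mirage's actual data structures): a nested,
# multi-level graph in which a kernel-level operator expands into a
# thread-block-level graph, whose operators expand into thread-level graphs.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Op:
    name: str                          # e.g. "matmul", "exp", "sum"
    inputs: List[str]                  # tensor names consumed by this op
    output: str                        # tensor name produced by this op
    body: Optional["Graph"] = None     # lower-level graph refining this op

@dataclass
class Graph:
    level: str                         # "kernel", "block", or "thread"
    ops: List[Op] = field(default_factory=list)

    def add(self, op: Op) -> "Graph":
        self.ops.append(op)
        return self

# Build a toy three-level program: a fused kernel whose block-level graph
# performs a tiled matmul, refined by per-thread fused multiply-adds.
thread_graph = Graph("thread").add(Op("fma", ["a_frag", "b_frag"], "acc"))
block_graph = Graph("block").add(
    Op("tiled_matmul", ["Q_tile", "K_tile"], "S_tile", body=thread_graph)
)
kernel_graph = Graph("kernel").add(
    Op("fused_attention_score", ["Q", "K"], "S", body=block_graph)
)

def describe(g: Graph, indent: int = 0) -> None:
    # Print the nested structure, one level of the GPU hierarchy per depth.
    for op in g.ops:
        print(" " * indent + f"[{g.level}] {op.name}: {op.inputs} -> {op.output}")
        if op.body is not None:
            describe(op.body, indent + 2)

describe(kernel_graph)

In this toy form, a superoptimizer could in principle rewrite any of the three levels jointly (for example, changing the kernel-level decomposition and the block-level tiling together), which is the kind of cross-level search the abstract attributes to μGraphs.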