Authors
Massinissa Merouani, Mohamed-Hicham Leghettas, Riyadh Baghdadi, Taha Arbaoui, Karima Benatchba
Publication date
2020/10
Institution
PhD thesis, 10 2020
Description
Programmers spend a lot of time and effort optimizing their code to make it run faster. This has led compiler researchers to focus on automatic optimization techniques that improve program performance without manual intervention. Such techniques aim to transform programs so that they exploit the underlying hardware more efficiently. Efficient automatic optimization requires precise cost models, which evaluate whether applying a sequence of code transformations reduces the execution time of the program. Building an analytical cost model for this purpose is hard on modern x86 architectures due to the complexity of the microarchitecture. In this work, we present a novel deep learning-based cost model for automatic code optimization. This cost model is integrated into an auto-scheduler that enables the Tiramisu compiler to select the best code transformations for a given program. The input of the proposed model is a tree-structured set of high-level features representing the unoptimized code together with a sequence of code transformations; the cost model predicts the speedup expected when those transformations are applied. To train the model, we built a dataset of 1.8 million points consisting of transformed synthetic Tiramisu programs. The proposed model achieves a low error rate of 16% MAPE when predicting speedups. Furthermore, it is effective at ranking code transformation candidates, reaching an nDCG score of 98%. Experiments on real-world image processing, deep learning, and scientific computing programs show that the auto-scheduler finds competitive solutions 95 times faster when it uses the proposed cost …
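To make the model's interface concrete, the following is a minimal, hypothetical sketch (written in PyTorch, which the abstract does not prescribe) of a tree-structured speedup predictor: each loop node carries a feature vector describing the computation and the candidate transformations, child embeddings are folded into their parent, and the root embedding is regressed to a positive speedup. The feature sizes, the recursive combiner, and the dictionary-based tree encoding are illustrative assumptions, not the paper's actual architecture.

# Minimal sketch of a tree-structured speedup predictor (assumed architecture,
# not the paper's exact model). Requires PyTorch.
import torch
import torch.nn as nn

class LoopTreeCostModel(nn.Module):
    """Predicts the speedup expected from applying a transformation sequence.

    A program is encoded as a tree of loop nodes; each node carries a feature
    vector describing the computation and the candidate transformations
    (tiling sizes, unroll factors, ...). The feature layout is hypothetical.
    """

    def __init__(self, feat_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Sequential(nn.Linear(feat_dim, hidden_dim), nn.ReLU())
        self.combine = nn.Sequential(nn.Linear(2 * hidden_dim, hidden_dim), nn.ReLU())
        self.regress = nn.Linear(hidden_dim, 1)

    def encode(self, node):
        # node = {"features": Tensor[feat_dim], "children": [node, ...]}
        h = self.embed(node["features"])
        for child in node["children"]:
            child_h = self.encode(child)
            # Fold each child's embedding into the parent's representation.
            h = self.combine(torch.cat([h, child_h], dim=-1))
        return h

    def forward(self, root):
        # Exponentiate the regression output so the predicted speedup is positive.
        return torch.exp(self.regress(self.encode(root)))

# Hypothetical usage: one outer loop with a single inner computation node.
model = LoopTreeCostModel()
leaf = {"features": torch.randn(64), "children": []}
root = {"features": torch.randn(64), "children": [leaf]}
print(model(root))  # predicted speedup for this (program, schedule) pair

Under this reading, the reported 16% MAPE would correspond to the mean absolute percentage error between predicted and measured speedups, and the 98% nDCG to how well sorting candidate schedules by predicted speedup reproduces their measured ordering.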
Total citations
Per-year citation chart, 2020–2024