Code artifacts and data for the CS 590 project.

The project involves several changes to the TVM deep learning compiler. These changes can be found in my fork. To run the experiments, please clone the fork and build from source following these instructions.
Experiments include:

- `exp1.sh`, `exp2.sh`, `exp3.sh`: these scripts summarize the paper experiments. Autotuning commands are commented out by default due to their long running times.
- Individual scripts adapted from the AutoTVM tutorials:
  - `tune_conv2d_cuda.py`: tunes a specific `conv2d` operator configuration.
  - `tune_conv2d_cuda_test.py`: evaluates the performance of a tuned `conv2d` configuration stored in a log file.
  - `feature_experiments_resnet18/tune_nnvm_cuda.py`: tunes ResNet-18 inference (12 `conv2d` configurations in total).
  - `transfer-learning/tune_conv2d_cuda_transfer.py`: tunes a `conv2d` operator using pretrained data for transfer learning.
- Incomplete neural network cost model experiments in `nn-cost-model` and `treernn-cost-model`.
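The tuning scripts above all follow the same AutoTVM pattern: define a search space of schedule knobs, measure candidate configurations, and log each result so the best one can be replayed later. A minimal, TVM-free sketch of that loop follows; the knob space, the stand-in `measure` cost function, and the file name `tune_sketch.log` are all illustrative assumptions, not the real AutoTVM API.

```python
import itertools
import json
import random

# Hypothetical knob space for a conv2d schedule: tile sizes for two loop axes.
# Real AutoTVM spaces are declared inside a schedule template.
SPACE = list(itertools.product([1, 2, 4, 8, 16], [1, 2, 4, 8, 16]))

def measure(config):
    """Stand-in for compiling and timing a schedule on the GPU.
    This fakes a cost surface that prefers mid-sized tiles."""
    tx, ty = config
    return abs(tx - 8) + abs(ty - 4) + random.random() * 0.1

def tune(n_trial, log_file):
    """Randomly sample configs, measure each, and append one JSON record
    per trial to the log, mirroring AutoTVM's tune-and-log structure."""
    with open(log_file, "w") as log:
        for config in random.sample(SPACE, n_trial):
            cost = measure(config)
            log.write(json.dumps({"config": list(config), "cost": cost}) + "\n")

def best_config(log_file):
    """Pick the lowest-cost record from the log, analogous to replaying
    the best tuning history when compiling the final operator."""
    with open(log_file) as log:
        records = [json.loads(line) for line in log]
    return min(records, key=lambda r: r["cost"])

random.seed(0)
tune(n_trial=10, log_file="tune_sketch.log")
print(best_config("tune_sketch.log"))
```

The real scripts replace the random sampler with AutoTVM's model-based tuners and the fake `measure` with actual compilation and GPU timing, which is why the full runs take hours.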
Autotuning result files are also included, since autotuning takes many hours to run:

- `baseline`, `feature_experiments_c7`, `feature_experiments_c12`: `conv2d` tuning results.
- `feature_experiments_resnet18`: end-to-end ResNet-18 tuning results.
- `transfer-learning`: `conv2d` transfer learning results.
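A common way to compare result sets such as `baseline` against `feature_experiments_c7` is to take the best measured time per workload in each log and report per-workload speedups plus their geometric mean. A small sketch of that comparison; the workload names and costs below are made-up placeholders, not numbers from the actual logs.

```python
# Hypothetical best mean costs (seconds) per conv2d workload, as would be
# extracted from the tuning logs in baseline/ and feature_experiments_c7/.
baseline = {"c1": 1.2e-3, "c2": 3.4e-3, "c3": 0.9e-3}
feature_c7 = {"c1": 1.0e-3, "c2": 3.0e-3, "c3": 1.0e-3}

# Speedup > 1 means the feature experiment found a faster schedule.
speedups = {w: baseline[w] / feature_c7[w] for w in baseline}

# Geometric mean is the standard summary for per-workload speedup ratios.
geomean = 1.0
for s in speedups.values():
    geomean *= s
geomean **= 1.0 / len(speedups)

for w, s in sorted(speedups.items()):
    print(f"{w}: {s:.2f}x")
print(f"geomean: {geomean:.2f}x")
```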
Raw, unpolished data can be found on the `raw-data` branch.

Figures in the paper are generated via `plot.sh`.