

Contributions to Proceedings:

B. Haas, A. Wendt, A. Jantsch, M. Wess:
"Neural Network Compression Through Shunt Connections and Knowledge Distillation for Semantic Segmentation Problems";
in: "Artificial Intelligence Applications and Innovations, 17th IFIP WG 12.5 International Conference", issued by: Springer Nature; Springer Nature Switzerland AG, Greece, 2021, ISBN: 978-3-030-79149-0, 349 - 361.



English abstract:
Employing convolutional neural network models on large-scale
datasets represents a major challenge. In particular, embedded devices
with limited resources cannot run most state-of-the-art model architectures
in real time, as required by many applications. This paper proves
the applicability of shunt connections, a proposed method for MobileNet
compression, on large-scale datasets and thereby narrows this
computational gap. We are the first to provide results of shunt
connections for the MobileNetV3 model and for segmentation tasks on
the Cityscapes dataset, using the DeepLabV3 architecture, on which we
achieve 28% compression while observing a 3.52 drop in mIoU. The
training of shunt-inserted models is optimized through knowledge distillation.
The full code used for this work will be available online.
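The abstract states that shunt-inserted models are trained with knowledge distillation. As a minimal sketch of what such a distillation objective typically looks like (a standard Hinton-style temperature-scaled loss, not necessarily the authors' exact formulation; all function and parameter names below are illustrative):

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; a higher T yields a softer distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Weighted sum of a soft (teacher) and a hard (ground-truth) term."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # Soft term: cross-entropy against the teacher's softened outputs,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = -np.sum(p_teacher * np.log(p_student + 1e-12), axis=-1).mean() * T**2
    # Hard term: ordinary cross-entropy against the ground-truth labels.
    p_hard = softmax(student_logits)
    hard = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * soft + (1 - alpha) * hard
```

In the paper's setting, the uncompressed network would play the role of the teacher and the shunt-inserted (compressed) model the role of the student; for semantic segmentation the loss is applied per pixel rather than per image.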

Keywords:
Shunt connections, Knowledge distillation, Optimization, Latency, Accuracy, CIFAR, Cityscapes, DeepLab, MobileNet, Machine learning, Embedded machine learning


"Official" electronic version of the publication (accessed through its Digital Object Identifier - DOI)
http://dx.doi.org/10.1007/978-3-030-79150-6

Electronic version of the publication:
https://publik.tuwien.ac.at/files/publik_296405.pdf


Created from the Publication Database of the Vienna University of Technology.