Jul, 2019
On-Device Neural Net Inference with Mobile GPUs
Juhyun Lee, Nikolay Chirkov, Ekaterina Ignasheva, Yury Pisarchyk, Mogan Shieh...
TL;DR
This paper describes how the GPU accelerators ubiquitous on mobile phones can be used to run deep neural network inference in real time on Android and iOS devices, and how the resulting engine is integrated into the open-source project TensorFlow Lite.
Abstract
On-device inference of machine learning models for mobile phones is desirable due to its lower latency and increased privacy. Running such a compute-intensive task solely on the mobile CPU, however, can be difficult…
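The TL;DR notes that this GPU inference engine ships as part of TensorFlow Lite. As a rough illustration of how an Android app might opt into GPU execution, here is a minimal sketch using the TensorFlow Lite Java API's GpuDelegate; the model file name and tensor shapes are placeholders for illustration, not details taken from the paper.

```java
import java.io.File;
import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.gpu.GpuDelegate;

public class GpuInferenceExample {
    public static void main(String[] args) {
        // Hypothetical model file; substitute a real .tflite model.
        File modelFile = new File("mobilenet_v1.tflite");

        // Create the GPU delegate and attach it to the interpreter options,
        // so that supported ops run on the mobile GPU instead of the CPU.
        GpuDelegate gpuDelegate = new GpuDelegate();
        Interpreter.Options options = new Interpreter.Options().addDelegate(gpuDelegate);

        try (Interpreter interpreter = new Interpreter(modelFile, options)) {
            // Placeholder input/output shapes (e.g. a 224x224 RGB image classifier).
            float[][][][] input = new float[1][224][224][3];
            float[][] output = new float[1][1001];
            interpreter.run(input, output);
        } finally {
            // The delegate owns GPU resources and must be released explicitly.
            gpuDelegate.close();
        }
    }
}
```

When the delegate cannot handle part of the graph, unsupported ops fall back to the CPU, so the same model file works with or without GPU acceleration.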