BriefGPT.xyz
Oct, 2020
Calibration-Aided Edge Inference Offloading via Adaptive Model Partitioning of Deep Neural Networks
Roberto G. Pacheco, Rodrigo S. Couto, Osvaldo Simeone
TL;DR
This work targets deep neural network (DNN) inference on mobile devices: it uses adaptive model partitioning to address communication delay, and predicts and calibrates the confidence of the on-device classifier, enabling more reliable offloading decisions.
Abstract
Mobile devices can offload deep neural network (DNN)-based inference to the cloud, overcoming local hardware and energy limitations. However, […]
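The TL;DR describes a partitioned DNN in which the device runs the early layers and decides, based on a calibrated confidence score, whether to keep a local early-exit prediction or offload intermediate features to the cloud. Below is a minimal sketch of that decision rule, assuming a PyTorch early-exit setup and temperature-scaling calibration; the names `device_backbone`, `exit_branch`, `cloud_head`, the temperature, and the threshold are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (not the paper's implementation) of calibration-aided
# offloading: run the on-device partition, compute a temperature-scaled
# confidence at the early exit, and offload only if confidence is too low.

import torch
import torch.nn.functional as F

def calibrated_confidence(logits: torch.Tensor, temperature: float) -> torch.Tensor:
    """Temperature-scaled softmax confidence (max class probability)."""
    probs = F.softmax(logits / temperature, dim=-1)
    return probs.max(dim=-1).values

@torch.no_grad()
def infer_with_offloading(x, device_backbone, exit_branch, cloud_head,
                          temperature: float = 1.5, threshold: float = 0.8):
    """Assumes a single input sample; all modules are hypothetical stand-ins."""
    features = device_backbone(x)          # on-device partition of the DNN
    early_logits = exit_branch(features)   # early-exit classifier on the device
    conf = calibrated_confidence(early_logits, temperature)

    if conf.item() >= threshold:
        # Confident enough: decide locally and avoid communication delay.
        return early_logits.argmax(dim=-1), "local"

    # Otherwise send the intermediate features to the cloud-side remainder
    # of the model (cloud_head stands in for the network call).
    cloud_logits = cloud_head(features)
    return cloud_logits.argmax(dim=-1), "offloaded"
```

In this sketch the temperature would be fitted on a held-out validation set, so that the early exit's confidence better reflects its actual accuracy before it is compared against the offloading threshold.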