Content-Aware Adaptive Device-Cloud Collaborative Inference for Object Detection
Date
2023-11-01
Publication Type
Journal Article
ETH Bibliography
yes
Abstract
Many intelligent applications based on deep neural networks (DNNs) increasingly run on Internet of Things (IoT) devices. Unfortunately, the computing resources of these devices are limited, which seriously hinders the widespread deployment of smart applications. A popular solution is to offload part of the computation from the IoT device to the cloud through device-cloud collaboration. However, existing collaboration approaches may suffer from long network transmission delays or degraded accuracy due to the large volume of intermediate results, posing enormous challenges for tasks, such as object detection, that require massive computing resources. In this article, we propose an efficient device-cloud collaborative inference (DCCI) object detection framework that dynamically adjusts the amount of transferred data according to the content of the input images. Specifically, a content-aware hard-case discriminator automatically classifies input images as hard cases or simple cases: hard cases are uploaded to the cloud and processed by a deployed heavyweight model, while simple cases are processed by a lightweight model deployed on the IoT device, where the lightweight model is automatically compressed via reinforcement learning according to the resource constraints of the device. Furthermore, a collaborative scheduler based on the runtime load and network transmission capability of IoT devices is proposed to optimize the collaborative computation between IoT devices and the cloud. Extensive experimental evaluations show that, compared with the Device-only approach, DCCI reduces the memory footprint and compute resources of IoT devices by more than 90.0% and 30.87%, respectively. Compared with the Cloud-centric approach, DCCI saves 2.0× of network bandwidth. In addition, compared with the state-of-the-art DNN partitioning method, DCCI saves 1.2× of inference latency and 1.3× of IoT device energy consumption under the same accuracy constraint.
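The core routing idea in the abstract — a discriminator scores each input image and sends hard cases to a heavyweight cloud model while simple cases stay on the device — can be sketched as follows. This is a minimal illustration, not the paper's implementation: all names (`route_image`, `hard_threshold`) and the scalar hardness score are hypothetical stand-ins for the learned discriminator and real detectors.

```python
def route_image(image, discriminator, device_model, cloud_model,
                hard_threshold=0.5):
    """Run the lightweight model on simple cases; offload hard cases.

    The discriminator returns a hardness score in [0, 1]; images at or
    above `hard_threshold` are treated as hard cases and offloaded.
    """
    hardness = discriminator(image)
    if hardness >= hard_threshold:
        return cloud_model(image)   # hard case: heavyweight cloud model
    return device_model(image)      # simple case: on-device lightweight model

# Toy usage with stub models standing in for real object detectors.
result = route_image(
    image=[0.1, 0.9],
    discriminator=lambda img: max(img),        # stub hardness score
    device_model=lambda img: "device-result",
    cloud_model=lambda img: "cloud-result",
)
```

In the toy call the stub hardness score is 0.9, so the image is routed to the cloud model. The paper's scheduler additionally conditions this decision on runtime device load and network capacity, which this sketch omits.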
Permanent link
Publication status
published
Volume
10 (21)
Pages / Article No.
19087–19101
Publisher
IEEE
Subject
Adaptive inference; Deep neural network (DNN); Device–cloud collaborative; Object detection