Y. Huang, X. Qiao, P. Ren, S. Dustdar, J. Chen:
"EdgeBooster: Edge-Assisted Real-Time Image Segmentation for the Mobile Web in WoT";
IEEE Internet of Things Journal, Volume 8 (2021), Issue 9; pp. 7288-7302.

English abstract:
Combining image segmentation with Web technology lays a good foundation for lightweight, cross-platform, and pervasive Web artificial intelligence applications, and further improves the capability of Web-of-Things (WoT) applications. However, whether we use a Web real-time communication media server that treats camera inputs as a video stream for advanced processing, or transfer continuous camera frames to the remote cloud for processing, we cannot obtain a satisfactory real-time experience due to high resource consumption and unacceptable latency. In this article, we present EdgeBooster, a computationally efficient architecture that leverages a common edge server to minimize communication costs, accelerates camera frame segmentation, and guarantees acceptable segmentation accuracy using prior knowledge. EdgeBooster provides real-time segmentation by developing parallel technology that enables segmentation on slices of a camera frame and by using superpixel-based presegmentation to accelerate the graph-based segmentation. It also introduces recent DNN-based segmentation results as prior knowledge to improve the performance of the graph-based segmentation, especially in nonideal scenes such as low light and weak contrast. Finally, it creates a pure frontend segmentation that can provide continuous and stable services for mobile users in unstable networks, such as under weak connectivity or with an unstable edge server. The experimental results show that EdgeBooster achieves considerable accuracy for the mobile Web, running at no less than 30 frames per second in real scenes.
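The abstract's idea of parallelizing segmentation across slices of a camera frame can be sketched as follows. This is a hypothetical illustration, not the authors' code: the frame, the strip split, and the toy threshold-based per-slice "segmentation" all stand in for the paper's superpixel-accelerated graph-based method.

```python
# Hypothetical sketch of slice-parallel frame segmentation (not the
# authors' implementation). A toy threshold rule replaces the paper's
# superpixel-accelerated graph-based segmentation.
from concurrent.futures import ThreadPoolExecutor

def segment_slice(rows, threshold=128):
    """Toy per-slice segmentation: label each pixel fg (1) or bg (0)."""
    return [[1 if px >= threshold else 0 for px in row] for row in rows]

def segment_frame(frame, n_slices=4):
    """Split the frame into horizontal strips and segment them in parallel."""
    h = len(frame)
    step = max(1, (h + n_slices - 1) // n_slices)  # rows per strip
    strips = [frame[i:i + step] for i in range(0, h, step)]
    with ThreadPoolExecutor(max_workers=n_slices) as pool:
        parts = list(pool.map(segment_slice, strips))
    return [row for part in parts for row in part]  # stitch strips back

# A 4x3 grayscale "frame" segmented in two parallel strips.
frame = [[0, 200, 50], [255, 10, 180], [90, 90, 200], [130, 0, 0]]
mask = segment_frame(frame, n_slices=2)
```

In the actual system, each strip would be handed to a worker on the edge server, and the per-strip results would be merged across slice boundaries rather than simply concatenated.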

Edge computing, image segmentation, mobile Web, Web-of-Things (WoT) applications

"Official" electronic version of the publication (according to its Digital Object Identifier, DOI)

Generated from the publication database of Technische Universität Wien.