Detailed Information

Cited 0 times in Web of Science · Cited 0 times in Scopus

Incentive-Aware Partitioning and Offloading Scheme for Inference Services in Edge Computing

Full metadata record
DC Field: Value
dc.contributor.author: Kim, TaeYoung
dc.contributor.author: Kim, Chang Kyung
dc.contributor.author: Lee, Seung-seob
dc.contributor.author: Lee, Sukyoung
dc.date.accessioned: 2025-03-20T02:50:37Z
dc.date.available: 2025-03-20T02:50:37Z
dc.date.issued: 2024-07
dc.identifier.issn: 1939-1374
dc.identifier.uri: https://yscholarhub.yonsei.ac.kr/handle/2021.sw.yonsei/23221
dc.description.abstract: Owing to remarkable improvements in deep neural networks (DNNs), various computation-intensive and delay-sensitive DNN services have been developed for smart IoT devices. However, running these services on the devices themselves is challenging due to their limited battery capacity and computational constraints. Although edge computing has been proposed as a solution, edge devices cannot meet the performance requirements of DNN services because most IoT applications require simultaneous inference services and DNN models continue to grow larger. To address this problem, we propose a framework that enables parallel execution of partitioned and offloaded DNN inference services over multiple distributed edge devices. Notably, edge devices may be reluctant to process offloaded tasks because of the energy they consume. Thus, to provide an incentive mechanism for edge devices, we model the interaction between the edge devices and DNN inference service users as a two-level Stackelberg game. Based on this model, we design the proposed framework to determine the optimal scheduling with a partitioning strategy, aiming to maximize user satisfaction while incentivizing the participation of edge devices. We further derive the Nash equilibrium points at the two levels. Simulation results show that the proposed scheme outperforms benchmark methods in terms of user satisfaction and the profits of edge devices.
dc.format.extent: 13
dc.publisher: Institute of Electrical and Electronics Engineers
dc.title: Incentive-Aware Partitioning and Offloading Scheme for Inference Services in Edge Computing
dc.type: Article
dc.publisher.location: United States
dc.identifier.doi: 10.1109/TSC.2024.3359148
dc.identifier.wosid: 001290231100025
dc.identifier.bibliographicCitation: IEEE Transactions on Services Computing, v.17, no.4, pp. 1580-1592
dc.citation.title: IEEE Transactions on Services Computing
dc.citation.volume: 17
dc.citation.number: 4
dc.citation.startPage: 1580
dc.citation.endPage: 1592
dc.description.isOpenAccess: N
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
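The abstract models the interaction between edge devices and inference users as a two-level Stackelberg game with derived Nash equilibria. As a toy illustration only (not the paper's actual model), the sketch below solves a single-leader, single-follower Stackelberg pricing game: an edge device (leader) posts a unit price for computation, the user (follower) best-responds with an offloading amount, and the leader chooses its price anticipating that response. The utility functions and parameters (`a`, `cost`) are invented for this example.

```python
def user_best_response(price, a=10.0):
    """Follower: maximize a*log(1+x) - price*x over x >= 0.
    The first-order condition a/(1+x) = price gives x* = a/price - 1."""
    return max(a / price - 1.0, 0.0)

def device_profit(price, cost=1.0, a=10.0):
    """Leader: profit (price - cost) * x*(price), anticipating the follower."""
    return (price - cost) * user_best_response(price, a)

def stackelberg_price(cost=1.0, a=10.0, grid=10000):
    """Grid-search the leader's optimal price over (cost, a).
    For these utilities the analytic optimum is p* = sqrt(a * cost)."""
    candidates = [cost + (a - cost) * i / grid for i in range(1, grid)]
    return max(candidates, key=lambda p: device_profit(p, cost, a))

p_star = stackelberg_price()          # ≈ sqrt(10 * 1) ≈ 3.16
x_star = user_best_response(p_star)   # user's equilibrium offload amount
```

Backward induction as used here (solve the follower's problem in closed form, then optimize the leader's anticipated payoff) is the standard way to find a Stackelberg equilibrium; the paper extends this idea to two levels with many devices and users plus a DNN partitioning strategy.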
Files in This Item
There are no files associated with this item.
Appears in
Collections
College of Computing > School of Advanced Computing > Department of Computer Science > 1. Journal Articles


Items in Scholar Hub are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Kim, Chang Kyung
College of Computing (Department of Computer Science)
