Optimized Cooperative Inference for Energy-Efficient and Low-Latency Mobile Edge Computing

Research output: Chapter in Book/Report/Conference proceeding · Conference contribution · peer-review

2 Scopus citations

Abstract

To overcome the limitations of standalone inference on edge devices or servers, we propose a cooperative inference method for mobile edge computing (MEC) systems. Using dual confidence thresholds on a small neural network (NN) at the edge, ambiguous images are filtered and sent to a larger NN on the server for reevaluation. We evaluate the method's accuracy, delay, and energy consumption, accounting for confidence score distributions that could trigger false alarms. A joint optimization problem is formulated to minimize delay and energy consumption by selecting optimal confidence thresholds, transmit power, and duty cycle while ensuring accuracy. Experimental results show that this approach significantly reduces delay and energy consumption while achieving higher accuracy than device-only inference and lower costs than server-only inference in various MEC scenarios.
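The dual-threshold filtering step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the threshold names `t_low`/`t_high`, the use of the top softmax score as the confidence measure, and the rule "offload when the score falls inside the ambiguous band" are all assumptions made for the sake of the example.

```python
import numpy as np

def cooperative_infer(x, edge_model, server_model, t_low, t_high):
    """Run the small edge NN; offload ambiguous inputs to the server NN.

    Here an input counts as 'ambiguous' when the edge model's top
    softmax score falls between the two confidence thresholds, so the
    edge cannot decide confidently either way (an assumed reading of
    the dual-threshold rule).
    """
    probs = edge_model(x)                  # softmax output of the edge NN
    conf = float(np.max(probs))
    if t_low <= conf < t_high:             # ambiguous band -> offload
        probs = server_model(x)            # reevaluate on the larger NN
        return int(np.argmax(probs)), "server"
    return int(np.argmax(probs)), "edge"
```

In this reading, widening the ambiguous band trades higher transmission delay and energy for accuracy closer to server-only inference, which is exactly the trade-off the joint optimization over thresholds, transmit power, and duty cycle navigates.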

Original language: English
Title of host publication: 39th International Conference on Information Networking, ICOIN 2025
Publisher: IEEE Computer Society
Pages: 642-647
Number of pages: 6
ISBN (Electronic): 9798331506940
DOIs
State: Published - 2025
Event: 39th International Conference on Information Networking, ICOIN 2025 - Chiang Mai, Thailand
Duration: 15 Jan 2025 – 17 Jan 2025

Publication series

Name: International Conference on Information Networking
ISSN (Print): 1976-7684

Conference

Conference: 39th International Conference on Information Networking, ICOIN 2025
Country/Territory: Thailand
City: Chiang Mai
Period: 15/01/25 – 17/01/25

Keywords

  • confidence thresholds
  • cooperative inference
  • joint optimization
  • mobile edge computing (MEC)
