Journal article in Computer Networks, 2022

Energy-aware task scheduling and offloading using deep reinforcement learning in SDN-enabled IoT network

Abstract

5G mobile network services have driven tremendous growth in the IoT network. As a result, countless battery-powered IoT devices are deployed to serve diverse scenarios, e.g., smart cities, autonomous farming, and smart manufacturing, to name but a few. In this context, energy consumption has become one of the most critical concerns in interconnecting smart IoT devices in such scenarios. Additionally, since these IoT devices are distributed in space and evolve over time, they are expected to deliver high-volume data scalably and predictably while minimizing end-to-end latency. Furthermore, edge IoT nodes often face the major hurdle of performing optimal resource distribution and achieving high performance levels while coping with task handling, energy conservation, and ultra-reliable low-latency variability. This paper investigates an energy-aware and low-latency-oriented computing task scheduling problem in a Software-Defined Fog-IoT Network. As a starting point, we formulate the online task assignment and scheduling problem as an energy-constrained Deep Q-Learning process, which strives to minimize network latency while ensuring energy efficiency by saving battery power under the constraints of application dependence. Then, given the task arrival process, we introduce a deep reinforcement learning (DRL) approach for dynamic task scheduling and assignment in SDN-enabled edge networks. We conducted comprehensive experiments and compared the presented algorithm against three baseline agents (i.e., deterministic, random, and A3C). Extensive simulation results demonstrate that our proposed solution outperforms these algorithms. Additionally, we highlight the characterizing feature of our design, energy-awareness, as it saves up to 87% more energy than the other approaches. We have shown that the offloading scheme can perform more tasks with the available battery power, with up to 50% lower time delay. Our results support our claim that the proposed solution can readily be used to dynamically optimize task scheduling and assignment of complex jobs with task dependencies in distributed Fog-IoT networks.
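The full method is in the PDF referenced below; as a rough illustration only, the sketch that follows shows how the energy-constrained Deep Q-Learning formulation summarized above could be realized, with a small Q-network choosing which fog/edge node executes each arriving task. The state features (per-node queue length and battery level plus task size and deadline), the reward weights, the node count, and the use of PyTorch are assumptions for illustration, not the authors' implementation.

    # Minimal sketch (not the authors' code): an energy-constrained DQN agent
    # that assigns each arriving task to one of several fog/edge nodes.
    import random
    from collections import deque

    import torch
    import torch.nn as nn
    import torch.optim as optim

    N_NODES = 4                      # candidate fog/edge nodes (assumed)
    STATE_DIM = 2 * N_NODES + 2      # per-node queue length + battery, task size, task deadline

    class QNetwork(nn.Module):
        """Small MLP that maps the network/task state to one Q-value per node."""
        def __init__(self, state_dim: int, n_actions: int):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(state_dim, 64), nn.ReLU(),
                nn.Linear(64, 64), nn.ReLU(),
                nn.Linear(64, n_actions),
            )

        def forward(self, x):
            return self.net(x)

    def reward(latency: float, energy: float, battery: float,
               w_lat: float = 1.0, w_en: float = 0.5) -> float:
        """Illustrative reward: penalize latency and energy use, and heavily
        penalize assignments that would nearly drain a node's battery."""
        r = -(w_lat * latency + w_en * energy)
        if battery - energy < 0.05:   # assumed battery-reserve threshold
            r -= 10.0
        return r

    q_net = QNetwork(STATE_DIM, N_NODES)
    target_net = QNetwork(STATE_DIM, N_NODES)
    target_net.load_state_dict(q_net.state_dict())
    optimizer = optim.Adam(q_net.parameters(), lr=1e-3)
    replay = deque(maxlen=10_000)
    gamma, eps = 0.99, 0.1

    def select_node(state: torch.Tensor) -> int:
        """Epsilon-greedy choice of the node that executes the arriving task."""
        if random.random() < eps:
            return random.randrange(N_NODES)
        with torch.no_grad():
            return int(q_net(state).argmax())

    def train_step(batch_size: int = 64):
        """One DQN update from a minibatch of stored transitions."""
        if len(replay) < batch_size:
            return
        batch = random.sample(replay, batch_size)
        states = torch.stack([b[0] for b in batch])
        actions = torch.tensor([b[1] for b in batch], dtype=torch.int64)
        rewards = torch.tensor([b[2] for b in batch], dtype=torch.float32)
        next_states = torch.stack([b[3] for b in batch])
        q = q_net(states).gather(1, actions.unsqueeze(1)).squeeze(1)
        with torch.no_grad():
            target = rewards + gamma * target_net(next_states).max(1).values
        loss = nn.functional.mse_loss(q, target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    # During simulation: observe the state, call select_node(), execute the task,
    # compute reward(), store (state, action, reward, next_state) in `replay`,
    # call train_step(), and periodically sync target_net with q_net.

In the sketch, the online arrival process is handled per task: each arrival triggers one action selection and one stored transition, which is how the paper's online assignment setting is approximated here.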
Main file: Computer_Networks_2.pdf (author-produced version)

Dates and versions

hal-03648574 , version 1 (21-04-2022)
hal-03648574 , version 2 (28-04-2022)

Identifiers

Cite

Bassem Sellami, Akram Hakiri, Sadok Ben Yahia, Pascal Berthou. Energy-aware task scheduling and offloading using deep reinforcement learning in SDN-enabled IoT network. Computer Networks, 2022, 210, pp.108957. ⟨10.1016/j.comnet.2022.108957⟩. ⟨hal-03648574v2⟩