OptCDU: Optimizing the Computing Data Unit Size for COIN

Publication: Contribution to journal › Research article › Contributed › Peer-reviewed

Contributors

Abstract

COmputing In the Network (COIN) has the potential to reduce the data traffic and thus the end-to-end latencies for data-rich services. Existing COIN studies have neglected the impact of the size of the data unit that the network nodes compute on. However, similar to the impact of the protocol data unit (packet) size in conventional store-and-forward packet-switching networks, the Computing Data Unit (CDU) size is an elementary parameter that strongly influences the COIN dynamics. We model the end-to-end service time consisting of the network transport delays (for data transmission and link propagation), the loading delays of the data into the computing units, and the computing delays in the network nodes. We derive the optimal CDU size that minimizes the end-to-end service time with gradient descent. We evaluate the impact of the CDU sizing on the amount of data transmitted over the network links and the end-to-end service time for computing the convolutional neural network (CNN) based Yoho and a Deep Neural Network (DNN) based Multi-Layer Perceptron (MLP). We distribute the Yoho and MLP neural modules over up to five network nodes. Our emulation evaluations indicate that COIN strongly reduces the amount of network traffic after the first few computing nodes. Also, the CDU size optimization has a strong impact on the end-to-end service time, whereby CDU sizes that are too small or too large can double the service time. Our emulations validate that our gradient descent minimization correctly identifies the optimal CDU size.
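The abstract describes a service-time model (transmission, propagation, loading, and computing delays over several in-network computing nodes) that is minimized over the CDU size with gradient descent, but the record does not reproduce the model itself. The following Python sketch only illustrates the general idea under a simple assumed pipeline model; the functional form, all parameter values, and the names service_time and grad are illustrative assumptions, not the model derived in the paper.

import numpy as np

# Illustrative parameters (assumptions, not values from the paper)
X = 8e6        # total data to process per service request [bit]
R = 1e9        # link transmission rate [bit/s]
C = 5e8        # per-node computing rate [bit/s]
d_load = 1e-4  # per-CDU loading delay into the computing unit [s]
d_prop = 1e-3  # per-link propagation delay [s]
N = 5          # number of COIN nodes traversed

def service_time(s):
    """Assumed end-to-end service time for CDU size s [bit]."""
    m = X / s                          # number of CDUs (treated as continuous)
    per_cdu = s / R + d_load + s / C   # per-CDU delay at one node
    # Pipelined store-compute-forward: the first node processes all m CDUs,
    # the last CDU then traverses the remaining N-1 nodes, plus propagation.
    return m * per_cdu + (N - 1) * per_cdu + N * d_prop

def grad(s, eps=1.0):
    """Numerical (central-difference) gradient of the service time w.r.t. s."""
    return (service_time(s + eps) - service_time(s - eps)) / (2 * eps)

# Plain gradient descent on the CDU size; small s inflates the per-CDU loading
# overhead, large s inflates the per-hop fill time, so a minimum exists between.
s = 1e4                                # initial CDU size [bit]
for _ in range(3000):
    s = max(1.0, s - 1e12 * grad(s))   # step size tuned for this toy model

print(f"approximate optimal CDU size: {s:.0f} bit, "
      f"service time: {service_time(s) * 1e3:.2f} ms")

Under this assumed model the descent settles near the CDU size that balances the per-CDU loading overhead against the per-hop transmission and computing time; the paper's evaluation with the Yoho and MLP workloads follows its own derived model and emulation setup.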

Details

Original language: English
Number of pages: 1
Journal: IEEE Transactions on Network and Service Management
Volume: 21
Issue number: 6
Publication status: Electronic publication ahead of print - 30 Aug 2024
Peer-review status: Yes

External IDs

Scopus 85202736107
ORCID /0000-0001-8469-9573/work/172567549

Keywords