A New Hope for Network Model Generalization
Date
2022-11
Publication Type
Conference Paper
ETH Bibliography
yes
Abstract
Generalizing machine learning (ML) models for network traffic dynamics tends to be considered a lost cause. Hence, for every new task, we design new models and train them on model-specific datasets closely mimicking the deployment environments. Yet an ML architecture called the Transformer has enabled previously unimaginable generalization in other domains. Nowadays, one can download a model pre-trained on massive datasets and fine-tune it for a specific task and context with comparatively little time and data. These fine-tuned models are now state-of-the-art for many benchmarks.
We believe this progress could translate to networking and propose a Network Traffic Transformer (NTT), a transformer adapted to learn network dynamics from packet traces. Our initial results are promising: NTT seems able to generalize to new prediction tasks and environments. This study suggests there is still hope for generalization through future research.
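The NTT architecture itself is not detailed in this record, but the core Transformer building block the abstract refers to, self-attention over a sequence, can be sketched for packet traces. The snippet below is an illustrative, hypothetical example (not the authors' code): a single attention head relates every packet in a trace to every other packet, using random matrices in place of learned weights and invented per-packet features.

```python
import numpy as np

rng = np.random.default_rng(0)

def self_attention(x, d_k=16):
    """Single-head scaled dot-product self-attention.
    x: (seq_len, d_model) array, one row of features per packet."""
    d_model = x.shape[1]
    # Random projections stand in for the learned weights of a trained model.
    W_q = rng.normal(size=(d_model, d_k))
    W_k = rng.normal(size=(d_model, d_k))
    W_v = rng.normal(size=(d_model, d_k))
    Q, K, V = x @ W_q, x @ W_k, x @ W_v
    scores = Q @ K.T / np.sqrt(d_k)                   # pairwise packet affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the trace
    return weights @ V                                # context-aware packet states

# A toy trace: 8 packets, each with 4 hypothetical features
# (e.g. inter-arrival time, size, observed delay, direction).
trace = rng.normal(size=(8, 4))
out = self_attention(trace)
print(out.shape)  # (8, 16): one context vector per packet
```

In the pre-train/fine-tune workflow the abstract describes, layers like this would be trained once on large generic traces, then only the task-specific output head retrained for a new prediction task.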
Publication status
published
Book title
HotNets '22: Proceedings of the 21st ACM Workshop on Hot Topics in Networks
Pages / Article No.
152–159
Publisher
Association for Computing Machinery
Event
21st ACM Workshop on Hot Topics in Networks (HotNets 2022)
Subject
Transformer; Packet-level modeling
Organisational unit
09477 - Vanbever, Laurent
Notes
Conference lecture held on November 14, 2022
Funding
ETH-03 19-2 - Dependable and Data-Driven Intelligent Networks (ETHZ)
Related publications and datasets
Is supplemented by: https://doi.org/10.3929/ethz-b-000586827