
Open access
Author
Date
2022-08
Type
Master Thesis
ETH Bibliography
yes
Altmetrics
Abstract
Learning the underlying network dynamics from packet-level data has long been considered so difficult that it is rarely attempted. While research has shown that machine learning (ML) models can learn behaviour and improve performance on specific networking tasks, these models do not generalize to other tasks. A newer ML model, the Transformer, has however shown strong generalization abilities in several fields: the model is pre-trained on large datasets in a task-agnostic fashion and fine-tuned on smaller datasets for task-specific applications, and it has become the state-of-the-art architecture for generalization in machine learning. We present a new Transformer architecture adapted to the networking domain, the Network Traffic Transformer (NTT), which is designed to learn network dynamics from packet traces. We pre-train our NTT to learn fundamental network dynamics and then leverage this learnt behaviour to fine-tune it to specific network applications quickly and efficiently. By learning such dynamics, the NTT can be used to make more network-aware decisions across applications, improve those applications, and make the networks of tomorrow more efficient and reliable.
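The pre-train-then-fine-tune workflow the abstract describes can be sketched as follows. This is a minimal illustrative outline, not the thesis code: the class names, feature dimensions, and heads are all assumptions, and the real NTT uses a trained Transformer encoder rather than the toy linear layers below. The point is only the structure: one shared encoder learnt during pre-training, reused unchanged when a small task-specific head is attached for fine-tuning.

```python
# Hypothetical sketch of a pre-train / fine-tune pipeline in the NTT style.
# All names and shapes are illustrative assumptions, not the thesis code.
import numpy as np

rng = np.random.default_rng(0)

class SharedEncoder:
    """Stand-in for the Transformer encoder that learns packet dynamics."""
    def __init__(self, d_in, d_model):
        self.W = rng.normal(scale=0.1, size=(d_in, d_model))
    def __call__(self, x):
        return np.tanh(x @ self.W)

class Head:
    """Small output layer swapped per task."""
    def __init__(self, d_model, d_out):
        self.W = rng.normal(scale=0.1, size=(d_model, d_out))
    def __call__(self, h):
        return h @ self.W

# 1) Pre-training: encoder plus a task-agnostic head (e.g. reconstructing
#    masked per-packet features) on large packet traces.
encoder = SharedEncoder(d_in=4, d_model=8)   # 4 toy per-packet features
pretrain_head = Head(d_model=8, d_out=4)     # reconstruct the features

# 2) Fine-tuning: reuse the *same* encoder, attach a small task head
#    (e.g. predicting a per-packet delay) and train on a smaller dataset.
task_head = Head(d_model=8, d_out=1)

trace = rng.normal(size=(16, 4))             # toy window of 16 packets
features = encoder(trace)                    # learnt representation, reused
prediction = task_head(features)
print(prediction.shape)                      # (16, 1)
```

The design choice this sketches is the one the abstract credits for generalization: the expensive, task-agnostic learning lives in the shared encoder, while adapting to a new network application only requires training a small head on top of it.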
Permanent link
https://doi.org/10.3929/ethz-b-000569234
Publication status
published
Publisher
ETH Zurich, Networked Systems
Organisational unit
09477 - Vanbever, Laurent / Vanbever, Laurent
Related publications and datasets
Is cited by: https://doi.org/10.3929/ethz-b-000577569