\clearpage
\null
\vfil
\thispagestyle{plain}
\begin{center}\textbf{Abstract}\end{center}
Learning underlying network dynamics from packet-level data has long been considered an extremely difficult task, to the point that it is rarely attempted. While research has shown that machine learning (ML) models can learn behaviour and improve performance on specific tasks in the networking domain, these models do not generalize to other tasks. However, a new ML architecture, the \emph{Transformer}, has shown remarkable generalization abilities in several fields: the model is pre-trained on large datasets in a task-agnostic fashion and then fine-tuned on smaller datasets for task-specific applications, and it has become the state-of-the-art architecture for generalization in machine learning. We present a new Transformer architecture adapted to the networking domain, the Network Traffic Transformer (NTT), designed to learn network dynamics from packet traces. We pre-train the NTT to learn fundamental network dynamics and then leverage this learnt behaviour to fine-tune it to specific network applications quickly and efficiently. By learning such dynamics, the NTT can make more network-aware decisions across applications, improve them, and help make the networks of tomorrow more efficient and reliable.
\vfil
\clearpage