On 4 December 2025, Emilia Ndilokelwa Weyulu successfully defended her PhD thesis, titled "Towards a Robust and Reproducible Evaluation Framework for Congestion Control Algorithms". She joined MPI for Informatics and Saarland University as a doctoral candidate in May 2019. The thesis was supervised by Prof. Dr. Anja Feldmann, Scientific Director of the Internet Architecture department, and Prof. Dr. Balakrishnan Chandrasekaran, Assistant Professor at Vrije Universiteit Amsterdam. The doctoral degree is awarded by Saarland University.
The abstract of the thesis:
Network congestion, the state in which systems such as switches or routers receive more data than they can handle, leads to packet losses, increased network delay, and reduced throughput for all data passing through the congested system. Congestion control remains a key research problem in networking, with both industry and academia proposing solutions to improve network performance. Congestion control algorithms (CCAs) are designed to address network congestion, ensure fair resource allocation among users (or applications), and maintain good network performance. Despite the innovation and research effort invested in designing new CCAs that cater to diverse network conditions, obtaining consensus on how to efficiently evaluate such algorithms has proven elusive within the networking community. Determining where any CCA falls short compared to the rest, and, most importantly, along which dimensions, remains difficult. Even performing all the pairwise comparisons between the algorithms is hard, because each algorithm behaves differently depending on the underlying network environment.
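To illustrate why exhaustive pairwise comparison does not scale, a minimal back-of-the-envelope sketch follows. The algorithm names and the environment count are illustrative assumptions, not taken from the thesis: with n CCAs and m distinct network environments, an exhaustive head-to-head evaluation needs C(n, 2) * m experiment configurations.

```python
from itertools import combinations

# Illustrative assumptions (not from the thesis): six well-known CCAs
# and 20 network environments (e.g. combinations of bandwidth, RTT,
# and buffer size).
ccas = ["Cubic", "Reno", "Vegas", "BBRv1", "BBRv3", "Copa"]
num_environments = 20

# Every unordered pair of CCAs must be compared in every environment.
pairs = list(combinations(ccas, 2))
total_experiments = len(pairs) * num_environments

print(len(pairs))         # 15 pairs for 6 algorithms
print(total_experiments)  # 300 experiment configurations
```

The quadratic growth in pairs, multiplied by the number of environments (each of which can change an algorithm's behavior), is what makes a fixed, universal test suite so costly to apply.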
In this thesis, we advocate for a fundamental rethinking of how we approach CCA evaluations. Rather than prescribing a standardized set of tests to be universally applied, we emphasize the importance of aligning evaluations with their underlying objectives. Shifting the focus in this way spares designers the burden of subjecting their CCAs to an exhaustive list of experiments, while simultaneously addressing the reproducibility challenges that currently plague this field. To this end, we developed a rigorous and reproducible "recipe" for evaluating CCAs. With this recipe, we were able to uncover fundamental issues in the design of Google's new CCA, BBRv3, work that was recognized with a "Best Paper" award at PAM'24. Furthermore, this research has helped highlight the critical network signals one must leverage in the design of network-assisted CCAs.
