This is my final project for CPSC 8510 Software Systems for Data Communication, taught by Prof. James Martin. In this project, we simulate an adaptive video streaming application using NS-3.

What is NS-3

NS-3 is a free and open-source discrete-event network simulator for internet systems, targeted primarily at research and educational use. You can check out the official documentation and user guide here. We picked ns-3 version 3.30 for this project, which can be downloaded through this link. Building and installing ns-3 can be done by following the official tutorial here.
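To give a flavor of what an ns-3 simulation script looks like, here is a minimal sketch modeled closely on the classic two-node UDP echo example from the official tutorial; the data rate, delay, and addresses are the tutorial's illustrative defaults, not values from our project.

```cpp
#include "ns3/core-module.h"
#include "ns3/network-module.h"
#include "ns3/point-to-point-module.h"
#include "ns3/internet-module.h"
#include "ns3/applications-module.h"

using namespace ns3;

int main (int argc, char *argv[])
{
  CommandLine cmd;
  cmd.Parse (argc, argv);

  // Two nodes connected by a point-to-point link.
  NodeContainer nodes;
  nodes.Create (2);

  PointToPointHelper p2p;
  p2p.SetDeviceAttribute ("DataRate", StringValue ("5Mbps"));
  p2p.SetChannelAttribute ("Delay", StringValue ("2ms"));
  NetDeviceContainer devices = p2p.Install (nodes);

  InternetStackHelper stack;
  stack.Install (nodes);

  Ipv4AddressHelper address;
  address.SetBase ("10.1.1.0", "255.255.255.0");
  Ipv4InterfaceContainer interfaces = address.Assign (devices);

  // A UDP echo server on node 1 and a client on node 0.
  UdpEchoServerHelper echoServer (9);
  ApplicationContainer serverApps = echoServer.Install (nodes.Get (1));
  serverApps.Start (Seconds (1.0));
  serverApps.Stop (Seconds (10.0));

  UdpEchoClientHelper echoClient (interfaces.GetAddress (1), 9);
  echoClient.SetAttribute ("MaxPackets", UintegerValue (1));
  ApplicationContainer clientApps = echoClient.Install (nodes.Get (0));
  clientApps.Start (Seconds (2.0));
  clientApps.Stop (Seconds (10.0));

  // Run the discrete-event simulation.
  Simulator::Run ();
  Simulator::Destroy ();
  return 0;
}
```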

What is Adaptive Video Streaming

Streaming is a technology used to deliver content from a server to clients over the internet without requiring the client to download the file first. Multimedia streaming is one of the most popular and successful streaming services, since it allows the user to start watching a video or listening to music almost immediately, without waiting for the file to finish downloading.

Video streaming typically requires a relatively fast internet connection. For services like Hulu, YouTube, and Netflix, 2-3 Mbps are required for SD, 5-8 Mbps for HD, and 12-25 Mbps for UHD. Live streaming uses the same techniques, but it is specifically designed for real-time internet content delivery.

Traditional progressive video streaming delivers a single video file over the internet, and the video is simply stretched or shrunk to fit different screen resolutions. Regardless of the device playing it, the video file is always the same.

Adaptive streaming (also known as adaptive bitrate streaming), by contrast, is a technique designed to deliver multimedia content to each user as efficiently as possible and at the highest quality that user can sustain. Specifically, adaptive streaming requires the server to create a different video file for each target screen size, and it lowers the video quality for devices with slow internet connections, as illustrated by the sketch below.
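The sketch below illustrates the idea of a "bitrate ladder": the server keeps several encodings of the same video, and the highest-quality one that fits under the client's estimated throughput is chosen. The representation names and bitrates are assumptions loosely based on the bandwidth figures quoted above, not the project's actual values.

```cpp
#include <cstdio>
#include <string>
#include <vector>

struct Representation
{
  std::string name;
  double bitrateMbps;   // encoded video rate for this quality level
};

// Pick the highest-quality representation whose bitrate fits under the
// client's currently estimated throughput (ladder sorted low -> high).
// If nothing fits, fall back to the lowest quality.
const Representation &
SelectRepresentation (const std::vector<Representation> &ladder,
                      double estimatedThroughputMbps)
{
  const Representation *best = &ladder.front ();
  for (const auto &rep : ladder)
    {
      if (rep.bitrateMbps <= estimatedThroughputMbps)
        {
          best = &rep;
        }
    }
  return *best;
}

int main ()
{
  std::vector<Representation> ladder = {
    {"480p (SD)", 2.5}, {"720p (HD)", 5.0},
    {"1080p (HD)", 8.0}, {"2160p (UHD)", 16.0}};

  for (double throughput : {1.0, 6.0, 20.0})
    {
      const Representation &rep = SelectRepresentation (ladder, throughput);
      std::printf ("throughput %.1f Mbps -> %s (%.1f Mbps)\n",
                   throughput, rep.name.c_str (), rep.bitrateMbps);
    }
  return 0;
}
```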

Problem

The project is to design a video streaming application with an adaptive rate controller on top of UDP. A streaming service has two competing objectives, maximizing video quality and minimizing rebuffering, and it is easy to meet either one on its own. To maximize video quality, a service could simply stream at the maximum video rate all the time; of course, this would risk extensive rebuffering. Conversely, to minimize rebuffering, the service could stream at the minimum video rate all the time, which would lead to low video quality. Our approach is to dynamically change the video rate based on the link speed and the client's frame buffer, and thus ensure the best viewing experience for the user.
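As a rough illustration of this kind of controller (not our actual algorithm; see the final report in the Resources section for the real method), the sketch below steps the video rate down when the client's frame buffer runs low and probes one level up when the buffer is comfortable and the measured link speed can sustain it. All names and thresholds here are assumptions for illustration only.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Candidate video rates in Mbps, sorted from lowest to highest quality.
static const std::vector<double> kVideoRatesMbps = {1.0, 2.5, 5.0, 8.0, 16.0};

static const double kLowBufferSec  = 4.0;   // risk of rebuffering below this
static const double kHighBufferSec = 15.0;  // safe to probe upward above this

// Return the index into kVideoRatesMbps the server should use next,
// given the client's buffer occupancy and the measured link speed.
int
ChooseRateIndex (int currentIndex, double bufferSec, double linkSpeedMbps)
{
  int maxIndex = static_cast<int> (kVideoRatesMbps.size ()) - 1;

  if (bufferSec < kLowBufferSec)
    {
      // Buffer is draining: step down immediately to avoid a stall.
      return std::max (0, currentIndex - 1);
    }

  if (bufferSec > kHighBufferSec
      && currentIndex < maxIndex
      && kVideoRatesMbps[currentIndex + 1] <= linkSpeedMbps)
    {
      // Plenty of buffered video and the link can sustain the next rate:
      // step up one level rather than jumping straight to the maximum.
      return currentIndex + 1;
    }

  // Otherwise hold the current rate.
  return currentIndex;
}

int main ()
{
  int idx = 2;                               // start at 5.0 Mbps
  idx = ChooseRateIndex (idx, 2.0, 10.0);    // low buffer -> step down
  std::printf ("rate = %.1f Mbps\n", kVideoRatesMbps[idx]);
  idx = ChooseRateIndex (idx, 20.0, 10.0);   // full buffer, fast link -> step up
  std::printf ("rate = %.1f Mbps\n", kVideoRatesMbps[idx]);
  return 0;
}
```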

Methods and Results

For a detailed explanation of the methods used in the project and the results we obtained, please refer to the final report in the Resources section below.

Issues

  • We noticed that a mobile device loses its connection to the server when it moves out of range of the wireless signal. However, we did not observe the transmission rate dropping as the mobile device moves away from the server, which calls for further investigation into how the different rate-control modes of the ns-3 wireless simulator behave (see the sketch after this list).
  • Due to time limitations, we did not use real video files for the transmission. Adding a video decoder to the application would also be a good improvement.
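One direction for that investigation: in ns-3 the Wi-Fi PHY rate only adapts to distance if an adaptive remote station manager is configured; with ConstantRateWifiManager the rate is pinned and frames are simply lost once the signal is too weak. The following is a hypothetical configuration sketch against the ns-3.30 API (the SSID, speed, and manager choice are assumptions, not our project's setup) showing where that choice is made.

```cpp
#include "ns3/core-module.h"
#include "ns3/network-module.h"
#include "ns3/mobility-module.h"
#include "ns3/wifi-module.h"

using namespace ns3;

int main (int argc, char *argv[])
{
  CommandLine cmd;
  cmd.Parse (argc, argv);

  NodeContainer apNode, staNode;
  apNode.Create (1);
  staNode.Create (1);

  // Default Yans channel and PHY.
  YansWifiChannelHelper channel = YansWifiChannelHelper::Default ();
  YansWifiPhyHelper phy = YansWifiPhyHelper::Default ();
  phy.SetChannel (channel.Create ());

  WifiHelper wifi;
  wifi.SetStandard (WIFI_PHY_STANDARD_80211n_5GHZ);

  // With ConstantRateWifiManager the PHY rate never reacts to distance:
  // wifi.SetRemoteStationManager ("ns3::ConstantRateWifiManager",
  //                               "DataMode", StringValue ("HtMcs7"),
  //                               "ControlMode", StringValue ("HtMcs0"));
  // An adaptive manager such as Minstrel-HT lowers the MCS (and hence the
  // achievable rate) as the station moves away from the AP.
  wifi.SetRemoteStationManager ("ns3::MinstrelHtWifiManager");

  WifiMacHelper mac;
  Ssid ssid = Ssid ("video-stream");
  mac.SetType ("ns3::ApWifiMac", "Ssid", SsidValue (ssid));
  NetDeviceContainer apDev = wifi.Install (phy, mac, apNode);
  mac.SetType ("ns3::StaWifiMac", "Ssid", SsidValue (ssid));
  NetDeviceContainer staDev = wifi.Install (phy, mac, staNode);

  // AP fixed at the origin; station walks away at 5 m/s.
  MobilityHelper mobility;
  mobility.SetMobilityModel ("ns3::ConstantPositionMobilityModel");
  mobility.Install (apNode);
  mobility.SetMobilityModel ("ns3::ConstantVelocityMobilityModel");
  mobility.Install (staNode);
  staNode.Get (0)->GetObject<ConstantVelocityMobilityModel> ()
    ->SetVelocity (Vector (5.0, 0.0, 0.0));

  Simulator::Stop (Seconds (30.0));
  Simulator::Run ();
  Simulator::Destroy ();
  return 0;
}
```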

Resources