Broadcasting of a Video Stream from an IP-camera using WebRTC
Technically, online broadcasting from an IP-camera doesn't require WebRTC. The camera is a server in its own right: it can connect to a router and stream video content on its own. So why do we need WebRTC in the first place?
There are at least two reasons for that:
- As the number of viewers watching the broadcast over the Internet grows, they first run into the limits of the available bandwidth, and then into the limits of the camera's own resources.
- As mentioned above, an IP-camera is a server. But what protocols can it use to deliver video to a browser or a mobile device? Most likely the camera uses HTTP streaming, where video frames or JPEG images are transmitted over HTTP. HTTP streaming, however, is not really suited to real-time video. It works well for video on demand, where interactivity and latency are not critical: if you are watching a movie, it hardly matters whether you receive it 5-10 seconds later. Well, unless you are watching it together with someone else. “Oh, no! It was Jack who murdered her!” Alice writes in a chat to Bob 10 seconds before he sees the tragic outcome.
The other option is RTSP/RTP with H.264, but in that case the browser needs a video player plugin installed, such as VLC or QuickTime. Such a plugin receives and plays the video just like a standalone player would. What we want, though, is genuine browser-based streaming, without any additional bells and whistles.
First, let's sniff the traffic of our IP-camera to see what exactly it sends to a browser. Our test subject is a D-Link DCS-7010L:
Installing and configuring this camera is covered in more detail below; for now we simply check how it handles video streaming. After logging in to the camera's admin panel, we see the following picture in its web interface (sorry for the landscape):
The image opens in every browser and stutters roughly once per second in all of them. Since both the camera and the laptop are connected to the same router, we would expect smooth and fluent playback, but that is not the case. It seems HTTP is to blame. Let's run Wireshark to confirm the suspicion:
Here we can see a sequence of TCP segments of 1514 bytes each,
followed by a closing HTTP 200 OK response containing the length of the received JPEG:
Then we open Chrome, Developer Tools, the Network tab, and watch GET requests and JPEG images being transferred over HTTP in real time:
This is not the kind of streaming we want. It is not fluent, and every frame costs a separate HTTP round trip. How many of these requests can the camera handle? We suspect that with 10 or more viewers the camera would either give up or become hopelessly sluggish.
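You can reproduce this per-frame behaviour from the command line as well. The snapshot path below is only a guess for this camera model; take the real URL from the Network tab:
# each frame costs a separate HTTP request/response round trip
$curl -s -o frame1.jpg http://192.168.1.34/image/jpeg.cgi
$curl -s -o frame2.jpg http://192.168.1.34/image/jpeg.cgi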
Taking a look at the HTML source of the camera's admin panel, we find the following code:
if (browser_IE)
    DW('<OBJECT CLASSID="CLSID:'+AxUuid+'" CODEBASE="/VDControl.CAB?'+AxVer+'#version='+AxVer+'" width="0" height="0" ></OBJECT>');
else
{
    if (mpMode == 1)
        var RTSPName = g_RTSPName1;
    else if (mpMode == 2)
        var RTSPName = g_RTSPName2;
    else if (mpMode == 3)
        var RTSPName = g_RTSPName3;
    var o = '';
    if (g_isIPv6)
        var host = g_netip;
    else
        var host = g_host;
    o += '<object id="qtrtsp_object" width="0" height="0" codebase="http://www.apple.com/qtactivex/qtplugin.cab" ';
    o += 'classid="clsid:02BF25D5-8C17-4B23-BC80-D3488ABDDC6B" type="video/quicktime">';
    o += '<param name="src" value="http://'+host+":"+g_Port+'/qt.mov" />';
    o += '<param name="autoplay" value="true" />';
    o += '<param name="controller" value="false" />';
    o += '<param name="qtsrc" value="rtsp://'+host+':'+g_RTSPPort+'/'+RTSPName+'"/>';
    o += '</object>';
    DW(o);
}
RTSP/RTP is exactly what we need to play the video smoothly. But will it work in a browser? No. Or rather, it will only with the QuickTime plugin installed, while we want pure browser streaming.
Another option worth mentioning here is Flash Player, which can also receive an RTMP stream converted from RTSP/RTP/H.264 by Wowza. But Flash Player is a browser plugin too, even if a far more widespread one than VLC or QuickTime.
In our case we test the same kind of RTSP/RTP re-streaming, but the player is a WebRTC-capable browser without any additional plugins or gadgets. We will set up a relay server that fetches the video stream from the IP-camera and broadcasts it over the Internet to an arbitrary number of users watching it in WebRTC-capable browsers.
Connecting the IP-camera
As mentioned above, we chose a fairly simple D-Link DCS-7010L IP-camera. The key selection criterion was RTSP support, because that is the protocol the server will use to fetch the stream from the camera.
We connected the camera to the router with the supplied patch cord. The camera powered up, found the router and obtained an IP address via DHCP; in our case it was 192.168.1.34 (if you open the router settings, you will see the connected device listed as DCS 7010L). Now it is time to test the camera.
Open that IP address, 192.168.1.34, in a browser to enter the camera's web admin interface. The password is empty by default.
As you can see, the video from the camera plays reasonably well in the admin panel, although we did notice periodic stutters. That is exactly what we are going to fix with WebRTC.
Configuring the camera
To start with, we disabled authentication, so that for testing purposes anyone can watch the broadcast. To do this, go to Setup – Network and set Authentication to Disable.
In the same section, check that RTSP uses the correct port; by default it is 554. The format of the output video is defined by the profile. Up to 3 profiles can be configured, but we will use the first one, live1.sdp, as it already delivers H.264 video and G.711 audio. Any of these settings can be changed later via Setup – Audio and Video.
Now we can test the camera over RTSP. Open VLC (or any other player that supports RTSP: QuickTime, Windows Media Player, RealPlayer and so on) and in the Open URL dialog enter the RTSP address of the camera. In our case it is: rtsp://192.168.1.34/live1.sdp
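The same check can be done from the command line, assuming VLC or FFmpeg is installed on the laptop:
# play the stream in VLC
$vlc rtsp://192.168.1.34/live1.sdp
# or just inspect the stream parameters without playing it
$ffprobe -rtsp_transport tcp rtsp://192.168.1.34/live1.sdp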
OK, it works as it should: the camera delivers the video stream to the player over RTSP.
By the way, the stream plays fluently, completely without artifacts. We can expect the same from WebRTC.
Installing the server
So, the camera is installed, tested with desktop players and ready to broadcast through a server. Using whatismyip.com we determined the external IP address of the camera's network: 178.51.142.223. Now we need to tell the router to forward all RTSP requests arriving on port 554 to the IP-camera.
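If you are already in a terminal, the external address can also be checked without a browser; both services below are third-party lookups, so treat them only as convenient examples:
$curl ifconfig.me
# or
$dig +short myip.opendns.com @resolver1.opendns.com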
So we type the corresponding settings into the router...
…and check the external IP-address and RTSP port using telnet:
telnet 178.51.142.223 554
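telnet only verifies that the TCP port accepts connections. To confirm that the RTSP stream itself survives the port forwarding, you can additionally probe the external address (again assuming FFmpeg is installed on the machine you test from):
$ffprobe rtsp://178.51.142.223/live1.sdp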
After making sure the port does answer, we begin installing the WebRTC server.
For hosting we took a 64-bit CentOS server on Amazon EC2. To reduce the chance of performance issues, we chose an m3.medium instance with one vCPU:
Of course, there are also Linode and DigitalOcean, but this time we went with Amazon. Jumping ahead a little: in the Amazon EC2 control panel you have to open certain ports for this example to work, namely the ports used by WebRTC traffic (SRTP, RTCP, ICE) and the ports for RTSP/RTP traffic. If you decide to repeat the test, make sure your Amazon inbound traffic rules look something like this:
DigitalOcean is even simpler in this respect: you just open these ports in the firewall, or disable it altogether. In our experience DO instances get a plain static IP and do not sit behind any NAT, so the juggling with ports we did on Amazon is not necessary there.
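Returning to Amazon for a moment: if you prefer the command line, roughly the same inbound rules can be added with the AWS CLI. The security group ID and all port numbers below are placeholders; the actual ports to open are listed in the media server documentation:
# RTSP (placeholder security group ID)
$aws ec2 authorize-security-group-ingress --group-id sg-0123456789 --protocol tcp --port 554 --cidr 0.0.0.0/0
# Websocket signalling (placeholder port range)
$aws ec2 authorize-security-group-ingress --group-id sg-0123456789 --protocol tcp --port 8080-8082 --cidr 0.0.0.0/0
# SRTP/RTP media (placeholder UDP range)
$aws ec2 authorize-security-group-ingress --group-id sg-0123456789 --protocol udp --port 30000-35000 --cidr 0.0.0.0/0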
As the server software for broadcasting the RTSP/RTP stream to WebRTC we used WebRTC Media & Broadcasting Server by Flashphoner. This streaming server is quite similar to Wowza, which can rebroadcast RTSP/RTP to Flash; the only difference is that here the stream is delivered via WebRTC rather than Flash. Technically this means the browser and the server perform a DTLS handshake, establish an SRTP session, and the server sends a VP8-encoded stream to the viewer.
Installation requires SSH-access.
Spoiler: the complete list of executed commands
1. Download the setup package of the server:
$wget flashphoner.com/downloads/builds/WCS/3.0/x8664/wcs3_video_vp8/FlashphonerMediaServerWebRTC-3.0/FlashphonerMediaServerWebRTC-3.0.868.tar.gz
2. Extract:
$tar -xzf FlashphonerMediaServerWebRTC-3.0.868.tar.gz
3. Install:
$cd FlashphonerMediaServerWebRTC-3.0.868
$./install.sh
During installation, specify the external IP address of the server (54.186.112.111) and the private IP (172.31.20.65).
4. Start the server:
$service webcallserver start
5. Check logs:
$tail -f /usr/local/FlashphonerWebCallServer/logs/server_logs/flashphoner.log
6. Make sure the server is running:
$ps aux | grep Flashphoner
7. Install and start apache:
$yum install httpd
$service httpd start
8. Download the web client files and place them into Apache's default folder /var/www/html:
$cd /var/www/html
$wget github.com/flashphoner/flashphoner_client/archive/wcs_media_client.zip
$unzip wcs_media_client.zip
9. Enter the IP address of the server into the config file flashphoner.xml:
10. Stop the firewall.
$service iptables stop
Properly, instead of step 10 you should configure the necessary ports and rules in the firewall, but for testing purposes we simply turned it off.
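A more careful version of step 10 would open only the required ports instead of disabling the firewall entirely. The rules below are just a sketch with placeholder port numbers; substitute the ports your media server actually listens on:
# RTSP from the camera (placeholder port)
$iptables -A INPUT -p tcp --dport 554 -j ACCEPT
# Websocket signalling from browsers (placeholder port)
$iptables -A INPUT -p tcp --dport 8080 -j ACCEPT
# UDP range for SRTP/RTP media (placeholder range)
$iptables -A INPUT -p udp --dport 30000:35000 -j ACCEPT
$service iptables save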
Configuring the server
The structure of our WebRTC broadcasting looks as follows:
We have already configured the basic elements of this diagram. Now we need to set up the “arrows”.
The web client is responsible for the interaction between a browser and the WebRTC server. It is available for download on GitHub. The JS, CSS and HTML files of the client were copied to /var/www/html during installation (see the installation steps above).
Browser-to-server communication is configured in the XML file flashphoner.xml. The IP address of the server has to be written there so that the web client can connect to the WebRTC server via HTML5 Websockets (see step 9).
Ok, we are done with server configuration, let’s test it:
Open the web client index page, index.html, in a browser (this requires Apache on the Amazon server, which we installed in step 7):
http://54.186.112.111/wcs_media_client/?id=rtsp://webrtc-ipcam.ddns.net/live1.sdp
webrtc-ipcam.ddns.net is a free domain obtained via the noip.com dynamic DNS service; it simply points to our external IP address. We also told the router to forward incoming RTSP requests to 192.168.1.34 according to the NAT address translation rules (see above).
The parameter id=rtsp://webrtc-ipcam.ddns.net/live1.sdp sets the URL of the stream to play. The WebRTC server fetches the stream from the IP-camera, processes it and broadcasts it to browsers via WebRTC. Your router may have built-in DDNS support; if not, you can use the camera's own DDNS option:
Here is how DDNS support looks in the router:
Now it is time to test the system and see the results.
Testing
After the link is opened, the browser connects to the WebRTC server, which in turn requests the video stream from the IP-camera. The whole process takes a few seconds.
The browser connects to the server through Websockets, the server queries the camera via RTSP, receives the H.264 stream over RTP, transcodes it to VP8 and wraps it in SRTP, and that stream is finally played by the WebRTC-compatible browser.
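To get a feel for the transcoding part of that pipeline, here is roughly what it would look like if done by hand with FFmpeg. This is only an illustration of the H.264-to-VP8 conversion, not what the server actually runs internally:
# pull the H.264 stream over RTSP/RTP and re-encode it to VP8 in real time (illustration only)
$ffmpeg -rtsp_transport tcp -i rtsp://webrtc-ipcam.ddns.net/live1.sdp -c:v libvpx -deadline realtime -b:v 1M -an -f webm out.webm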
After a tiny delay, we can see the familiar picture.
The bottom part of the video displays the URL of the video stream. You can copy it to open in another browser or in a new tab.
Making sure it is WebRTC indeed
But what if we are being cheated and the video from the IP-camera still travels over HTTP? Let's not just trust the picture and check what kind of traffic we actually receive. We run Wireshark and the Chrome developer console again. In the Chrome console we see the following:
This time there are no HTTP requests flying back and forth and no images transferred over HTTP. All we see are Websocket frames, most of them ping/pong messages that keep the Websocket session alive. The interesting frames are connect, prepareRtspSession and onReadyToPlay: the connection to the server goes through exactly these stages, first the Websocket connection, then the playback request.
And here is what chrome://webrtc-internals shows:
According to the charts, the bitrate coming from the IP-camera is about 1 Mbps. There is some outgoing traffic as well, most likely RTCP and ICE packets. The RTT to the Amazon server is about 300 milliseconds.
Now let's look at Wireshark. It clearly shows UDP traffic coming from the IP address of the server; the packets in the picture below are 1468 bytes each. This is WebRTC: specifically, SRTP packets carrying the VP8 video frames we see in the browser. There are also some STUN requests (the bottom packet in the picture); that is WebRTC ICE diligently checking the connection.
It is worth mentioning that playback showed comparatively low latency (the ping to the data center was about 260 ms). WebRTC works over SRTP/UDP, which is the fastest way to deliver packets compared with HTTP, RTMP and other TCP-based streaming methods. The visible latency should therefore be roughly the RTT plus buffering, decoding and rendering time.
Visually, no latency is noticeable to the naked eye, which suggests it stays below 500 milliseconds.
The next test was to add more viewers. We opened 10 Chrome windows, each displaying the picture, which made Chrome itself rather sluggish; but in an 11th window opened on another computer the playback remained fluent.
WebRTC on mobile devices
As you know, the Chrome and Firefox browsers on Android support WebRTC as well. Let's see whether our broadcast works there:
An HTC smartphone plays the video from the IP-camera in Firefox. There was no difference in playback smoothness compared with the desktop.
Conclusion
In the end, we managed to set up a WebRTC online broadcast from an IP-camera to several browsers with minimal effort. No rain dances, voodoo or rocket science were involved; only basic Linux and SSH knowledge was required.
The quality of the broadcast was perfectly acceptable, and the latency is not noticeable to the naked eye.
We can conclude that browser-based WebRTC broadcasting certainly deserves consideration: here WebRTC is not a supplementary add-on or plugin, but a genuine platform for playing video in the browser.
Why don’t we see WebRTC commonly used then?
The main obstacle is probably codec support. The WebRTC community and vendors should make the effort to bring the H.264 codec into WebRTC. There is nothing wrong with VP8, but why ignore the millions of devices and applications that already work with H.264? Those damn patents…
The second reason is incomplete browser support. The question remains open for IE and Safari, where you either have to stream by other means or rely on plugins such as webrtc4all.
In the future we hope to see many interesting solutions that do not require transcoding, and many browsers able to play streams from various devices directly.