Introduction
The example below demonstrates how to implement a service running on a Raspberry Pi which captures video from the camera and streams it to a .NET client, which processes it and displays it on the screen.
To implement this scenario, the following topics need to be addressed:
- Capturing video from the Raspberry Pi camera by the service application.
- Streaming video across the network.
- Processing and displaying video by the .NET client application.
Capturing Video from Raspberry Pi Camera
The Raspberry Pi Camera is a high definition camera producing video data in the raw H.264 format.
To control the camera and capture video, Raspberry Pi provides the console application raspivid,
which can be executed with various parameters specifying how the video shall be captured. E.g. you can specify parameters like width, height or frames per second, as well as whether the video shall be written to a file or to stdout (standard output).
The service application implemented in this example internally uses the raspivid application. To capture the video the service starts raspivid and then reads incoming video data from raspivid's stdout.
The service executes raspivid with the following parameters (which are suitable for live streaming):
raspivid -n -vf -hf -ih -w 320 -h 240 -fps 24 -t 0 -o -
-n            | No preview.
-vf -hf       | Flip the video vertically and horizontally.
-ih           | Insert SPS and PPS inline headers into the video stream (so that if a second client connects to an ongoing stream, it can synchronize to image frames).
-w 320 -h 240 | Produce 320 x 240 pixel video.
-fps 24       | Produce video at 24 frames per second.
-t 0          | Capture video indefinitely.
-o -          | Write the video to standard output (so that it can be captured by the service application).
There are other parameters you can experiment with. To list all of them, run:
raspivid -h
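Internally the service simply launches raspivid as a child process and reads its stdout. As an illustration, the sketch below composes the exact argument list from the table above; the ProcessBuilder call is only shown in a comment because raspivid exists only on the Raspberry Pi.

```java
// Sketch: composing the raspivid command line used by the service.
// The values (320x240, 24 fps) mirror the ones from the article.
public class RaspividCommand
{
    // Builds the raspivid argument list for live streaming to stdout.
    public static String[] buildCommand(int width, int height, int fps)
    {
        return new String[] {
            "raspivid",
            "-n",            // no preview
            "-vf", "-hf",    // flip vertically and horizontally
            "-ih",           // insert SPS/PPS inline headers
            "-w", String.valueOf(width),
            "-h", String.valueOf(height),
            "-fps", String.valueOf(fps),
            "-t", "0",       // capture indefinitely
            "-o", "-"        // write video to stdout
        };
    }

    public static void main(String[] args)
    {
        String[] aCommand = buildCommand(320, 240, 24);
        System.out.println(String.join(" ", aCommand));

        // On the Pi, the command would then be started like this:
        // Process aProcess = new ProcessBuilder(aCommand).start();
        // InputStream aVideoStream = aProcess.getInputStream();
    }
}
```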
Streaming Video Across Network
Live video data continuously coming from raspivid's stdout needs to be transferred across the network to connected clients.
To transfer the data, the implementation uses the Eneter Messaging Framework, a lightweight cross-platform library for interprocess communication.
To avoid serialization/deserialization overhead, the communication is based directly on duplex channels. This means the service application running on the Raspberry Pi uses a duplex input channel and the .NET client running on the PC uses a duplex output channel.
Then, when a chunk of video data is read from raspivid's stdout, the service uses the duplex input channel to send the data to connected clients. The .NET client uses the duplex output channel to receive the video data and raise an event for further processing (e.g. displaying).
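The forwarding loop itself is independent of Eneter and can be sketched as plain stream handling: read up-to-4 KB chunks from the video stream and pass on only the bytes actually read, since InputStream.read may fill the buffer only partially. The in-memory stream below is a stand-in for raspivid's stdout.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ChunkForwarder
{
    // Reads the stream in up-to-4KB chunks and collects each chunk,
    // trimmed to the number of bytes actually read.
    public static List<byte[]> forward(InputStream videoStream) throws IOException
    {
        List<byte[]> aSentChunks = new ArrayList<byte[]>();
        byte[] aBuffer = new byte[4096];
        int aSize;
        while ((aSize = videoStream.read(aBuffer)) != -1)
        {
            // Send only 'aSize' bytes; sending the whole buffer would
            // append stale bytes and corrupt the H.264 stream.
            aSentChunks.add(Arrays.copyOf(aBuffer, aSize));
        }
        return aSentChunks;
    }

    public static void main(String[] args) throws IOException
    {
        byte[] aFakeVideo = new byte[10000]; // stands in for raspivid output
        List<byte[]> aChunks = forward(new ByteArrayInputStream(aFakeVideo));
        int aTotal = 0;
        for (byte[] aChunk : aChunks) aTotal += aChunk.length;
        System.out.println(aChunks.size() + " chunks, " + aTotal + " bytes");
    }
}
```

In the real service, each collected chunk would be handed to the duplex input channel instead of a list.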
Processing and Displaying Video by .NET Client
Although H.264 is a very common encoding, it is not a trivial task to play live video encoded with this codec.
The major problem is that MediaElement (the UI control from WPF) does not support playing video from a memory stream or from an array of bytes. It expects a path to a local file or a URL.
I found a hack proposing to register your own protocol, so that when the protocol name appears in a URI the registered library gets the call, but I did not want to go with such a solution.
Another problem is that on Windows Vista (or Windows XP) you would have to install codecs for H.264 (and even after I did so, I still did not manage to play raw H.264 bytes stored in a file).
Another alternative is to use the VLC library from VideoLAN. Although VLC does not support playing from a memory stream or from a byte array, it supports playing video from a named pipe. This means if the video source is specified like
stream://\\\.\pipe\MyPipeName
then VLC will try to open MyPipeName and play the video.
But this is not an out-of-the-box solution either.
The major problem is that the VLC library exports just pure C methods, so it does not contain a WPF-based UI control you can simply drag and drop into your UI.
There are several wrappers implementing UI controls on top of VLC (e.g. a very promising solution I tested is nVLC implemented by Roman Ginzburg), but these implementations look quite complex and some of them are buggy.
I like the approach described by Richard Starkey (part 1 and part 2): provide just a thin wrapper and use the VLC functionality directly. The advantage is that the solution is lightweight and gives you full flexibility. And as you will see, it is really not difficult to use.
So I have slightly reworked Richard's original code and used it for the implementation of the .NET client. The implementation of the whole wrapper can be found in the VLC.cs file.
To Run the Example
Download
- Download and unzip this example.
- Download 'Eneter for .NET' and 'Eneter for Java' from http://www.eneter.net/ProductDownload.htm.
- Download and install VLC media player from https://www.videolan.org/. (The VLC libraries will be used by the .NET application to play the video stream.)
Raspberry Pi service application
- Open the Java project raspberry-camera-service in Eclipse and add a reference to the eneter-messaging.jar which you downloaded.
(Right click on the project -> Properties -> Java Build Path -> Libraries -> Add External JARs -> eneter-messaging-6.0.1.jar)
- Build the project and export it to an executable jar.
(Right click on the project -> Export... -> Java -> Runnable JAR file -> Launch configuration -> Export Destination -> Package required libraries into generated JAR -> Finish.)
- Copy the generated jar to the Raspberry Pi device.
- Start the application:
java -jar raspberry-camera-service.jar
.NET Client Application
- Open the RaspberryCameraClient solution and add a reference to the Eneter.Messaging.Framework.dll which you downloaded.
- Check that the path to VLC is correct in MainWindow.xaml.cs.
- Provide the correct IP address of your Raspberry Pi service in MainWindow.xaml.cs.
- Compile and run.
- Press 'Start Capturing'.
Raspberry Pi Service Application
The Raspberry Pi service is a simple console application implemented in Java. It listens for clients. When the first client connects, it starts the raspivid application to begin video capturing. Then it consumes the stdout of raspivid and forwards the video data to the connected client.
When a second client connects, it simply starts forwarding the ongoing stream to that client too. The client is able to synchronize with the video stream because it contains the SPS and PPS inline headers.
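The reason the inline headers enable mid-stream synchronization can be illustrated with a small scanner: raw H.264 is a sequence of NAL units introduced by the start code 00 00 00 01, and the low five bits of the byte that follows give the NAL unit type (7 = SPS, 8 = PPS). A client joining an ongoing stream just waits until it has seen these two types before decoding. The byte values below are synthetic:

```java
import java.util.ArrayList;
import java.util.List;

public class NalScanner
{
    // Returns the NAL unit types found after each 00 00 00 01 start code.
    public static List<Integer> nalTypes(byte[] stream)
    {
        List<Integer> aTypes = new ArrayList<Integer>();
        for (int i = 0; i + 4 < stream.length; ++i)
        {
            if (stream[i] == 0 && stream[i + 1] == 0
                && stream[i + 2] == 0 && stream[i + 3] == 1)
            {
                aTypes.add(stream[i + 4] & 0x1F); // low 5 bits = NAL type
            }
        }
        return aTypes;
    }

    public static void main(String[] args)
    {
        // Synthetic stream: SPS (0x67), PPS (0x68), then a coded slice (0x65).
        byte[] aStream = {
            0, 0, 0, 1, 0x67, 0x42, 0, 0x29,
            0, 0, 0, 1, 0x68, (byte) 0xCE,
            0, 0, 0, 1, 0x65, 0x11, 0x22
        };
        System.out.println(nalTypes(aStream));
    }
}
```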
The code is very simple:
package eneter.camera.service;

import java.io.InputStream;
import java.util.HashSet;

import eneter.messaging.diagnostic.EneterTrace;
import eneter.messaging.messagingsystems.messagingsystembase.*;
import eneter.messaging.messagingsystems.tcpmessagingsystem.TcpMessagingSystemFactory;
import eneter.net.system.EventHandler;

class CameraService
{
    private IDuplexInputChannel myVideoChannel;
    private Process myRaspiVidProcess;
    private InputStream myVideoStream;
    private HashSet<String> myConnectedClients = new HashSet<String>();
    private boolean myClientsUpdatedFlag;
    private Object myConnectionLock = new Object();

    public void startService(String ipAddress, int port) throws Exception
    {
        try
        {
            IMessagingSystemFactory aMessaging = new TcpMessagingSystemFactory();
            myVideoChannel = aMessaging.createDuplexInputChannel(
                "tcp://" + ipAddress + ":" + port + "/");
            myVideoChannel.responseReceiverConnected().subscribe(myClientConnected);
            myVideoChannel.responseReceiverDisconnected().subscribe(myClientDisconnected);
            myVideoChannel.startListening();
        }
        catch (Exception err)
        {
            stopService();
            throw err;
        }
    }

    public void stopService()
    {
        if (myVideoChannel != null)
        {
            myVideoChannel.stopListening();
        }
    }

    private void onClientConnected(Object sender, ResponseReceiverEventArgs e)
    {
        EneterTrace.info("Client connected.");

        try
        {
            synchronized (myConnectionLock)
            {
                myConnectedClients.add(e.getResponseReceiverId());
                myClientsUpdatedFlag = true;

                // The first client starts the capturing.
                if (myRaspiVidProcess == null)
                {
                    String aToExecute = "raspivid -n -vf -hf -ih -w 320 -h 240 -fps 24 -t 0 -o -";
                    myRaspiVidProcess = Runtime.getRuntime().exec(aToExecute);
                    myVideoStream = myRaspiVidProcess.getInputStream();

                    Thread aRecordingThread = new Thread(myCaptureWorker);
                    aRecordingThread.start();
                }
            }
        }
        catch (Exception err)
        {
            EneterTrace.error("Failed to start video capturing.", err);
        }
    }

    private void onClientDisconnected(Object sender, ResponseReceiverEventArgs e)
    {
        EneterTrace.info("Client disconnected.");

        synchronized (myConnectionLock)
        {
            myConnectedClients.remove(e.getResponseReceiverId());
            myClientsUpdatedFlag = true;

            // The last client stops the capturing.
            if (myConnectedClients.isEmpty() && myRaspiVidProcess != null)
            {
                myRaspiVidProcess.destroy();
                myRaspiVidProcess = null;
            }
        }
    }

    private void doCaptureVideo()
    {
        try
        {
            String[] aClients = {};
            byte[] aVideoData = new byte[4096];
            int aSize;
            while ((aSize = myVideoStream.read(aVideoData)) != -1)
            {
                // Refresh the local snapshot of connected clients if it changed.
                if (myClientsUpdatedFlag)
                {
                    synchronized (myConnectionLock)
                    {
                        aClients = myConnectedClients.toArray(new String[myConnectedClients.size()]);
                        myClientsUpdatedFlag = false;
                    }
                }

                // Forward only the bytes actually read; the buffer may not be full.
                byte[] aChunk = new byte[aSize];
                System.arraycopy(aVideoData, 0, aChunk, 0, aSize);

                for (String aClient : aClients)
                {
                    try
                    {
                        myVideoChannel.sendResponseMessage(aClient, aChunk);
                    }
                    catch (Exception err)
                    {
                        // Sending to one client failed; continue with the others.
                        EneterTrace.warning("Failed to send video data to the client.");
                    }
                }
            }
        }
        catch (Exception err)
        {
            EneterTrace.error("Reading video data from raspivid failed.", err);
        }

        EneterTrace.info("Capturing thread ended.");
    }

    private EventHandler<ResponseReceiverEventArgs> myClientConnected
        = new EventHandler<ResponseReceiverEventArgs>()
    {
        @Override
        public void onEvent(Object sender, ResponseReceiverEventArgs e)
        {
            onClientConnected(sender, e);
        }
    };

    private EventHandler<ResponseReceiverEventArgs> myClientDisconnected
        = new EventHandler<ResponseReceiverEventArgs>()
    {
        @Override
        public void onEvent(Object sender, ResponseReceiverEventArgs e)
        {
            onClientDisconnected(sender, e);
        }
    };

    private Runnable myCaptureWorker = new Runnable()
    {
        @Override
        public void run()
        {
            doCaptureVideo();
        }
    };
}
.NET Client Application
The .NET client is a simple WPF-based application. When a user clicks 'Start Capturing', it creates a named pipe and sets VLC to use this named pipe as its video source. It also sets VLC to expect raw H.264 encoded video data. Then, using Eneter, it opens the connection to the Raspberry Pi service. When video data is received, it writes the data to the named pipe so that VLC can process and display it.
Please do not forget to provide the correct IP address of your Raspberry Pi and to check whether you need to update the path to VLC!
The code is very simple:
using System;
using System.IO.Pipes;
using System.Threading;
using System.Windows;
using Eneter.Messaging.MessagingSystems.MessagingSystemBase;
using Eneter.Messaging.MessagingSystems.TcpMessagingSystem;
using VLC;

namespace RaspberryCameraClient
{
    public partial class MainWindow : Window
    {
        private IDuplexOutputChannel myVideoChannel;
        private NamedPipeServerStream myVideoPipe;
        private VlcInstance myVlcInstance;
        private VlcMediaPlayer myPlayer;

        public MainWindow()
        {
            InitializeComponent();

            // WinForms panel hosting the video rendered by VLC.
            System.Windows.Forms.Panel aVideoPanel = new System.Windows.Forms.Panel();
            aVideoPanel.BackColor = System.Drawing.Color.Black;
            VideoWindow.Child = aVideoPanel;

            // Update this path if VLC is installed in a different location.
            myVlcInstance = new VlcInstance(@"c:\Program Files\VideoLAN\VLC\");

            // Update the IP address to the one of your Raspberry Pi service.
            myVideoChannel = new TcpMessagingSystemFactory()
                .CreateDuplexOutputChannel("tcp://192.168.1.17:8093/");
            myVideoChannel.ResponseMessageReceived += OnResponseMessageReceived;
        }

        private void Window_Closed(object sender, EventArgs e)
        {
            StopCapturing();
        }

        private void OnStartCapturingButtonClick(object sender, RoutedEventArgs e)
        {
            StartCapturing();
        }

        private void OnStopCapturingButtonClick(object sender, RoutedEventArgs e)
        {
            StopCapturing();
        }

        private void StartCapturing()
        {
            // Create the named pipe VLC will read the video from.
            string aVideoPipeName = Guid.NewGuid().ToString();
            myVideoPipe = new NamedPipeServerStream(@"\" + aVideoPipeName,
                PipeDirection.Out, 1,
                PipeTransmissionMode.Byte,
                PipeOptions.Asynchronous, 0, 32764);

            // Wait for VLC to connect to the pipe on a background thread.
            ManualResetEvent aVlcConnectedPipe = new ManualResetEvent(false);
            ThreadPool.QueueUserWorkItem(x =>
            {
                myVideoPipe.WaitForConnection();
                aVlcConnectedPipe.Set();
            });

            // Set VLC to play raw H.264 video from the named pipe.
            using (VlcMedia aMedia = new VlcMedia(myVlcInstance,
                @"stream://\\\.\pipe\" + aVideoPipeName))
            {
                aMedia.AddOption(":demux=H264");
                myPlayer = new VlcMediaPlayer(aMedia);
                myPlayer.Drawable = VideoWindow.Child.Handle;
                myPlayer.Play();
            }

            if (!aVlcConnectedPipe.WaitOne(5000))
            {
                throw new TimeoutException("VLC did not open the connection with the pipe.");
            }

            // Open the connection to the Raspberry Pi service.
            myVideoChannel.OpenConnection();
        }

        private void StopCapturing()
        {
            myVideoChannel.CloseConnection();

            if (myVideoPipe != null)
            {
                myVideoPipe.Close();
                myVideoPipe = null;
            }

            if (myPlayer != null)
            {
                myPlayer.Dispose();
                myPlayer = null;
            }
        }

        private void OnResponseMessageReceived(object sender, DuplexChannelMessageEventArgs e)
        {
            // Forward the received video data to VLC via the named pipe.
            // Messages may still arrive briefly after StopCapturing, so guard the pipe.
            NamedPipeServerStream aVideoPipe = myVideoPipe;
            if (aVideoPipe != null && aVideoPipe.IsConnected)
            {
                byte[] aVideoData = (byte[])e.Message;
                aVideoPipe.Write(aVideoData, 0, aVideoData.Length);
            }
        }
    }
}