
Stream 360 Video to VR Headset from Drones with RICOH THETA Camera

31 May 2019 · CPOL · 3 min read
Live stream 360 video from a flying drone to a VR headset such as Oculus Rift or HTC Vive


360 Video Streams and Drones

Drones are in wide use for industrial surveillance as well as recreation, but they do not provide an immersive experience in VR headsets such as the Oculus Rift or HTC Vive.

Most of the problems with the immersive experience stem from the use of multiple cameras: they can cover the drone's entire panoramic view, but they do not accurately track the scene when a person turns their head in a VR headset.

To solve this problem, Jake Kenin used a RICOH THETA V to live stream the video feed to a ground station and VR headset.


Tech

Video Tech

Jake's solution uses MotionJPEG. This is a compromise: it results in lower resolution and a lower framerate than other technologies allow. Although the camera itself can stream at 3840x1920 and 30 fps, Jake's current solution has the performance listed below at a distance of 0.25 miles:

  • ~250 ms delay at 1920x960 @ 8 fps
  • ~100 ms delay at 1024x512 @ 30 fps

The delay refers to the time it takes for a captured image to appear in the headset. For example, if you move the drone with your controller, you will not see the movement for about a fifth of a second. You want the delay to be as low as possible; if it is too long, the drone becomes too difficult to pilot.

When you view a 360 video in a headset, you can't see the entire video at once; about 2/3 of the panorama is hidden, and you turn your head to see the hidden parts. Because only a slice of the full frame is in view at any moment, a 1024-pixel-wide video ends up looking more like a 400-pixel video.

fps refers to frames per second, often called the framerate. The higher the framerate, the smoother the video; at 8 fps, the video will be noticeably jerky.
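
As a side note, the stream-reading code later in this article increments a frames counter each time a full JPEG is decoded. A throwaway probe like the following sketch (an illustration, not Jake's code) could report the effective framerate:

JavaScript
// Hypothetical framerate probe: `frames` is incremented once per decoded
// JPEG by the stream reader shown later in this article.
setInterval(() => {
    console.log(`framerate: ${frames} fps`);
    frames = 0;    // reset the counter each second
}, 1000);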

Jake tested other transmission technologies, such as RTMP and RTSP, instead of MotionJPEG. Although the resolution and framerate were much better, the delay was greater than one second, making the drone difficult to pilot at high speed.

Due to the limitations of MotionJPEG, Jake plans to adapt his project to 4K video using a high-compression video standard.

Viewer Tech

To display the video stream in an Oculus Rift headset, Jake used Electron with an npm package that provides the OpenVR bindings.
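
A minimal Electron entry point might look like the sketch below. This is an illustration only, not Jake's actual main process; the window size and file name are assumptions:

JavaScript
// Sketch: open a desktop window that loads the viewer page, which hosts
// the stream reader and A-Frame scene shown later in this article.
const { app, BrowserWindow } = require('electron');

app.on('ready', () => {
    const win = new BrowserWindow({ width: 1280, height: 720 });
    win.loadFile('viewer.html');    // hypothetical page name
});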

Jake also uses A-Frame for the VR functionality.
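
In A-Frame, an equirectangular image can be wrapped around the viewer with an <a-sky> entity. The snippet below is a sketch of that idea (the entity names are assumptions, not Jake's viewer code):

JavaScript
// Sketch (assumes the A-Frame library is loaded in the page): build a
// scene with an <a-sky>, which maps an equirectangular image around the
// viewer. The stream reader can then repoint its src at each new frame.
const scene = document.createElement('a-scene');
const sky = document.createElement('a-sky');
sky.setAttribute('id', 'stream-sky');
scene.appendChild(sky);
document.body.appendChild(scene);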


Mounting Tech

To stabilize the video stream, Jake used a gimbal.


Getting the Code and Build Details

Extensive details on the build are available in a series of forum posts on community.theta360.guide.

The code for the Amelia Viewer that Jake wrote is available on GitHub.


A full parts list for the project is included in the build posts linked above.

Because MotionJPEG is a stream, you need to read the byte stream and display each frame yourself; every frame of the video is a complete JPEG image. For many developers, this is the most difficult part of dealing with a live video stream. There are many JavaScript examples for reading the byte stream. This is how Jake did it:

JavaScript
// In the full source, `reader` comes from fetch(streamUrl).body.getReader(),
// SOI is the JPEG start-of-image marker [0xFF, 0xD8], getLength() parses the
// Content-Length from the part headers, and headers, contentLength,
// imageBuffer, bytesRead, bytesThisSecond and frames persist across chunks.
const read = () => {

    reader.read().then(({done, value}) => {
        if (done) {
            return;
        }

        for (let index = 0; index < value.length; index++) {

            // Start of the frame; everything we've read until now was header
            if (value[index] === SOI[0] && value[index + 1] === SOI[1]) {
                contentLength = getLength(headers);
                imageBuffer = new Uint8Array(
                    new ArrayBuffer(contentLength));
            }
            // we're still reading the header.
            if (contentLength <= 0) {
                headers += String.fromCharCode(value[index]);
            }
            // we're now reading the jpeg.
            else if (bytesRead < contentLength) {
                imageBuffer[bytesRead++] = value[index];
                bytesThisSecond++;
            }
            // we're done reading the jpeg. Time to render it.
            else {
                //console.log("jpeg read with bytes : " + bytesRead);

                // Generate blob of the image and emit event
                lastFrameImgUrl = URL.createObjectURL(
                    new Blob([imageBuffer], {type: TYPE_JPEG}));
                var reRenderEvent = new CustomEvent(RERENDER_EVENT,
                    { detail: lastFrameImgUrl });
                document.dispatchEvent(reRenderEvent);

                // Reset for the next frame
                frames++;
                contentLength = 0;
                bytesRead = 0;
                headers = '';
            }
        }

        // Keep pulling chunks from the stream
        read();
    });
};
The full code is in this file of his GitHub repo. 
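
On the consuming side, something has to listen for those re-render events and display each frame. The sketch below (an illustration using the names from the snippet above, not Jake's exact viewer code) points the A-Frame sky at each new blob URL and revokes the previous one to avoid leaking memory:

JavaScript
// Swap the sky texture to the newest frame on every re-render event.
let previousUrl = null;
document.addEventListener(RERENDER_EVENT, (event) => {
    document.querySelector('a-sky').setAttribute('src', event.detail);
    if (previousUrl) {
        URL.revokeObjectURL(previousUrl);    // free the prior frame's blob
    }
    previousUrl = event.detail;
});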

Community Recognition

Jake solved a problem that the community had been working on for three years, and he won the Project of the Month award for his success.

Variation of the Flying Drone

Jake's concept is similar to the Fox Sewer Rover by Hugues Perret.


The transmission code for the FOX SEWER ROVER project is available here.

Testing on a Raspberry Pi


The code from Hugues is written in Python and will run on a Raspberry Pi. If you want to experiment with a ground vehicle such as a rover or car instead of a drone, this is a good option.

Join the Pioneer Discussion

There's still a lot to do and explore; this area is in a nascent state, and it's possible for hobbyists to make breakthroughs and contributions. If you have questions, comments, or contributions, feel free to drop them into the dev community forum for the RICOH THETA 360 camera used as the platform for this project.

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL).