The Intel® RealSense™ SDK provides a new feature that lets developers record camera streaming sequences to a file on disk for later playback. This feature is very helpful for debugging and troubleshooting camera issues in applications.
Recording raw color and depth streams to disk files places a heavy demand on the disk I/O bandwidth of the host system. For example, with a color configuration of RGB32 1920x1080 at 30fps and a depth configuration of 640x480 at 30fps, the SDK needs about 272MB/s of disk I/O bandwidth to write the samples to the disk. That is beyond the sustained write capability of most spinning disks and some slower SSDs.
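To see roughly where that figure comes from, here is an illustrative back-of-the-envelope estimate (not taken from the SDK documentation), assuming 4 bytes per pixel for RGB32 color and 2 bytes per pixel for depth:
// Rough estimate of the raw recording bandwidth for the example configuration.
// Assumes 4 bytes/pixel for RGB32 color and 2 bytes/pixel for 16-bit depth.
#include <cstdio>

int main() {
    double color = 1920.0 * 1080 * 4 * 30;   // ~249 MB/s of color payload
    double depth = 640.0 * 480 * 2 * 30;     // ~18 MB/s of depth payload
    std::printf("raw payload: about %.0f MB/s\n", (color + depth) / 1e6);  // ~267 MB/s
    // Per-frame metadata and file-format overhead account for the remainder
    // of the ~272MB/s quoted above.
    return 0;
}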
To address this, the SDK provides an experimental feature to compress the samples before writing them to the disk. The feature is based on H.264 encoding (I-frame only, constant QP) for color samples and lossless Lempel–Ziv–Oberhumer (LZO) encoding for depth samples. The compression ratio is roughly 10:1 for color samples and 2:1 for depth samples, which reduces the disk I/O bandwidth to about 32MB/s for the previous example. To use this feature, you need to use Intel® Iris™ Graphics with the latest Intel Iris Graphics driver. You can control the recording features with the following registry settings:
Windows Registry Editor Version 5.00
[HKEY_CURRENT_USER\SOFTWARE\Intel\RSSDK\FileRecording]
"DisableH264Compression"=dword:0
"H264_QPI"=dword:8
"DisableLZOCompression"=dword:0
By default, the H.264 compression is enabled on the color stream, and the LZO compression is enabled on all other streams. The H.264 QPI (I-frame quantization parameter) value ranges from 0 (least compression) to 51 (most compression).
Important Note: A recorded file with the H.264 compression can only be played back on systems with Intel Iris Graphics.
In C++, C#, or Java code, use the SetFileName function from the CaptureManager instance to set the file name and the mode (recording or playback):
C++ pxcStatus SetFileName(pxcCHAR *file, pxcBool record);
C# pxcmStatus SetFileName(String file, Boolean record);
Java pxcmStatus SetFileName(String file, boolean record);
Parameters
file The full path of the file to play back or to record.
record If true, set the recording mode. Otherwise, set the playback mode.
Steps to record the streaming sequences to a file
- Use the SetFileName function from the CaptureManager instance.
- Provide a file name and set the record parameter to true. There is no restriction on what the file name can be, except that in recording mode the file must be writable.
- Change the registry settings if you want to record uncompressed streams (see the example after this list).
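For example, to record fully uncompressed streams, both compression settings can presumably be disabled by setting the corresponding registry values to 1 (a sketch based on the keys shown earlier; the value 1 is assumed to disable the codec that the default value 0 leaves enabled):
Windows Registry Editor Version 5.00
[HKEY_CURRENT_USER\SOFTWARE\Intel\RSSDK\FileRecording]
"DisableH264Compression"=dword:1
"DisableLZOCompression"=dword:1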
Here is sample code to record or play back streaming sequences:
void RecordORPlayback(pxcCHAR *file, bool record) {
    PXCSenseManager *sm=PXCSenseManager::CreateInstance();

    // Set the file name and the mode: record=true records, record=false plays back.
    sm->QueryCaptureManager()->SetFileName(file,record);

    // Enable the color stream; fps=0 means any available frame rate.
    sm->EnableStream(PXCCapture::STREAM_TYPE_COLOR,640,480,0);
    sm->Init();

    for (int i=0;i<300;i++) {
        if (sm->AcquireFrame(true)<PXC_STATUS_NO_ERROR) break;
        PXCCapture::Sample *sample=sm->QuerySample();
        ...
        sm->ReleaseFrame();
    }
    sm->Release();
}
During recording, samples are written to the disk as they are processed by the application. For example, if the application captures unaligned color and depth samples, the samples on the disk are unaligned. If the application aligns the samples, the samples on the disk are aligned.
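For instance, to have both color and depth samples recorded, the application simply enables both streams before initialization; whatever samples the application receives are exactly what gets written to the file. A minimal sketch reusing the calls from the recording example above (the file name here is hypothetical):
PXCSenseManager *sm = PXCSenseManager::CreateInstance();
// Hypothetical file name; record=true enables recording mode.
sm->QueryCaptureManager()->SetFileName(L"clip.rssdk", true);
// Both streams are recorded as the application receives them; the recorder
// does not apply any additional alignment.
sm->EnableStream(PXCCapture::STREAM_TYPE_COLOR, 1920, 1080, 30);
sm->EnableStream(PXCCapture::STREAM_TYPE_DEPTH, 640, 480, 30);
sm->Init();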
The recorded file starts with a fixed-size header with the following structure:
struct Header {
    pxcI32 ID;
    pxcI32 fileVersion;
    pxcI32 firstFrameOffset;
    pxcI32 nstreams;
    pxcI64 frameIndexingOffset;
    PXCSession::CoordinateSystem coordinateSystem;
    pxcI32 reserved[26];
};
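To illustrate the layout, here is a small reader sketch. It assumes little-endian byte order, pxcI32/pxcI64 mapping to int32_t/int64_t, the coordinate-system enum stored as a 32-bit value, and no padding between fields; it is not an official parser:
#include <cstdint>
#include <cstdio>

#pragma pack(push, 1)               // assume no padding between header fields
struct FileHeader {
    int32_t id;
    int32_t fileVersion;
    int32_t firstFrameOffset;       // where the frame chunks begin
    int32_t nstreams;
    int64_t frameIndexingOffset;
    int32_t coordinateSystem;       // PXCSession::CoordinateSystem value
    int32_t reserved[26];
};
#pragma pack(pop)

bool ReadHeader(const char *path, FileHeader *hdr) {
    FILE *f = std::fopen(path, "rb");
    if (!f) return false;
    bool ok = std::fread(hdr, sizeof(*hdr), 1, f) == 1;
    std::fclose(f);
    return ok;
}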
The image frames of all streams are recorded sequentially after the header. Each frame starts with ChunkFrameMetaData and ends with ChunkFrameData. In between, there can be multiple configuration chunks that describe the metadata of the image frame. This metadata should be interpreted as a delta, or set of changes, against what is defined in the file header section, which reduces the file size when certain metadata are common to all frames. The frame structure is:
struct StreamFrame {
    ChunkFrameMetaData frame_header;
    ChunkImageMetaData image_meta_data;
    ChunkFrameData frame_data;
} frames[];
The ChunkFrameMetaData structure is:
struct ChunkFrameMetaData {
    ChunkId chunkId=CHUNK_FRAME_META_DATA;
    pxcI32 chunkSize=sizeof(metaData);
    struct {
        pxcI32 frameNumber;
        PXCCapture::StreamType streamType;
        pxcI64 timeStamp;
        PXCImage::Option options;
    } metaData;
};
You can put ChunkImageMetaData in the frame, and its structure is:
struct ChunkImageMetaData {
    ChunkId chunkId=CHUNK_IMAGE_META_DATA;
    pxcI32 chunkSize=sizeof(buffer)+sizeof(id);
    pxcUID id;
    pxcBYTE buffer[chunkSize-sizeof(id)];
};
The ChunkFrameData structure for an uncompressed stream is:
struct ChunkFrameDataUncompressed {
    ChunkId chunkId=CHUNK_FRAME_DATA;
    pxcI32 chunkSize=sizeof(imageData);
    struct {
        pxcI32 pitches[PXCImage::NUM_OF_PLANES];
        pxcBYTE plane0[pitches[0]*height];
        ...
        pxcBYTE planeN[pitches[PXCImage::NUM_OF_PLANES-1]*height];
    } imageData;
};
The structure for a compressed stream is:
struct ChunkFrameDataCompressed {
    ChunkId chunkId=CHUNK_FRAME_DATA;
    pxcI32 chunkSize=sizeof(imageData);
    struct {
        pxcI32 pitches[PXCImage::NUM_OF_PLANES];
        enum {
            H264=0x343632,
            LZO=0x4f5a4c,
        } CompressionIdentifier;
        pxcBYTE compressed_data[];
    } imageData;
};
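The identifier values are the ASCII strings "264" and "LZO" packed into little-endian 32-bit integers. A reader might branch on that identifier when unpacking a compressed frame chunk; the following is only a sketch that assumes the chunk payload has already been read into memory, with the actual H.264 and LZO decoders left out:
#include <cstdint>
#include <cstdio>
#include <cstring>

// Compression identifiers from ChunkFrameDataCompressed.
enum CompressionIdentifier : uint32_t {
    COMPRESSION_H264 = 0x343632,   // bytes '2' '6' '4' 0 -> "264"
    COMPRESSION_LZO  = 0x4f5a4c,   // bytes 'L' 'Z' 'O' 0 -> "LZO"
};

// payload points at imageData: the pitches array, the identifier, then the
// compressed bytes. numPlanes corresponds to PXCImage::NUM_OF_PLANES.
void DispatchFrameData(const uint8_t *payload, size_t size, int numPlanes) {
    size_t offset = numPlanes * sizeof(int32_t);           // skip the pitches array
    uint32_t id;
    std::memcpy(&id, payload + offset, sizeof(id));
    const uint8_t *data = payload + offset + sizeof(id);   // start of compressed bytes
    size_t dataSize = size - offset - sizeof(id);
    (void)data;                                             // a real reader passes this to the decoder
    switch (id) {
        case COMPRESSION_H264: std::printf("H.264 frame, %zu bytes\n", dataSize); break;
        case COMPRESSION_LZO:  std::printf("LZO frame, %zu bytes\n", dataSize);   break;
        default:               std::printf("unknown compression id 0x%x\n", id);  break;
    }
}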
The RealSense™ SDK capture module adds configuration frames to the file. A configuration frame can contain arbitrary data, identified by the chunk identifier and the chunk size. The order in which the chunk data appears in the file is not important, with the general rule that if there are dependencies between two chunks, the dependent chunk must be placed later in the file. The ChunkData structure is:
struct ChunkData {
    enum ChunkId {
        CHUNK_DEVICEINFO = 1,
        CHUNK_STREAMINFO = 2,
        CHUNK_PROPERTIES = 3,
        CHUNK_PROFILES = 4,
        CHUNK_SERIALIZEABLE = 5,
        CHUNK_FRAME_META_DATA = 6,
        CHUNK_FRAME_DATA = 7,
        CHUNK_IMAGE_META_DATA = 8,
        CHUNK_FRAME_INDEXING = 9,
    } chunkId;
    pxcI32 chunkSize;
    pxcBYTE chunkData[chunkSize];
} chunks[];
The following configuration frames must be present in the file:
- CHUNK_DEVICEINFO: device information
- CHUNK_STREAMINFO: stream information
- CHUNK_PROPERTIES: device properties
- CHUNK_PROFILES: stream profiles
- CHUNK_SERIALIZEABLE: device calibration data
- CHUNK_FRAME_INDEXING: frame indexing
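Putting this together, a reader can skip past the fixed-size header and then walk the chunk stream, using each chunk's identifier and size to advance. The sketch below reuses the FileHeader struct and layout assumptions from the header example above and simply lists the chunks it finds; it is illustrative only:
#include <cstdint>
#include <cstdio>

// Walk the chunk stream of a recorded file and print chunk ids and sizes.
// Configuration chunks (CHUNK_DEVICEINFO, CHUNK_STREAMINFO, ...) appear before
// the frame chunks, which begin at firstFrameOffset.
void DumpChunks(const char *path) {
    FILE *f = std::fopen(path, "rb");
    if (!f) return;

    FileHeader hdr;                                    // struct from the earlier sketch
    if (std::fread(&hdr, sizeof(hdr), 1, f) != 1) { std::fclose(f); return; }

    int32_t chunkId, chunkSize;
    while (std::fread(&chunkId, sizeof(chunkId), 1, f) == 1 &&
           std::fread(&chunkSize, sizeof(chunkSize), 1, f) == 1) {
        std::printf("chunk id=%d size=%d\n", chunkId, chunkSize);
        std::fseek(f, chunkSize, SEEK_CUR);            // skip the chunk payload
    }
    std::fclose(f);
}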
Steps to play back a recorded streaming sequence file
- Use the SetFileName function from the CaptureManager instance.
- Provide the rssdk-format file name and set the record parameter to false.
For file playback, the SDK immediately creates the Capture instance (using the QueryCapture function), so that the application can query the capabilities of the recorded content. You can configure the SDK file playback behaviors as follows:
| Function | Default | Description |
| --- | --- | --- |
| SetPause | false | If true, the file playback returns the same sample of the current frame repeatedly. |
| SetRealtime | true | If true, the file playback returns the current frame sample at its presentation time (according to the sample time stamp). If false, the file playback returns the sample immediately. |
Choose pause=true and realtime=false if you want to accurately locate any frame data during playback. Here is sample code that shows how to use SetRealtime and SetPause:
PXCSenseManager* sm = PXCSenseManager::CreateInstance();

// Set the file name and select playback mode (record=false).
sm->QueryCaptureManager()->SetFileName(filename, false);
sm->EnableStream(PXCCapture::STREAM_TYPE_COLOR, 0, 0);
sm->Init();

// Disable real-time pacing and pause playback so frames can be located by index.
sm->QueryCaptureManager()->SetRealtime(false);
sm->QueryCaptureManager()->SetPause(true);

for (int i = 0; i < nframes; i+=3) {
    // Seek to frame i, then acquire it.
    sm->QueryCaptureManager()->SetFrameByIndex(i);
    sm->FlushFrame();
    pxcStatus sts = sm->AcquireFrame(true);
    if (sts < PXC_STATUS_NO_ERROR) break;
    PXCCapture::Sample* sample = sm->QuerySample();
    ....
    sm->ReleaseFrame();
}
sm->Release();
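The nframes bound in the loop above is the number of frames in the recorded file. Assuming your SDK version exposes the PXCCaptureManager frame-count query, QueryNumberOfFrames (check your SDK headers), it can be obtained as follows:
// Number of frames in the recorded file (assumed API; verify against your SDK version).
pxcI32 nframes = sm->QueryCaptureManager()->QueryNumberOfFrames();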
As this discussion shows, with this new Intel® RealSense™ SDK feature developers can embed code to record camera streaming sequences into compressed files on disk and play them back later, which is very helpful for debugging and troubleshooting issues in applications.