Introduction
This short program shows how a live video stream from a webcam (or from a video file) can be rendered in OpenGL as a texture. The live video stream is captured using the Open Source Computer Vision library (OpenCV). The program also shows how an OpenCV image can be converted into an OpenGL texture. This code can serve as the first step in developing an Augmented Reality application with OpenCV and OpenGL.
Understanding the Code
The program renders an OpenGL textured quad that shows a live video stream. The code contains no additional functionality and is kept very simple for easy understanding.
The OpenGL texture is re-created continuously in the OnIdle callback function. The next available frame in the video stream is captured first:
IplImage *image = cvQueryFrame(g_Capture);
The image is stored in the OpenCV data structure IplImage; please see the OpenCV documentation for details. The image captured by OpenCV is stored in BGR channel order, so it is first converted to RGB using the OpenCV function cvCvtColor:
cvCvtColor(image, image, CV_BGR2RGB);
Then, the following magic call creates a 2D OpenGL texture from the OpenCV image:
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, image->width, image->height,
GL_RGB, GL_UNSIGNED_BYTE, image->imageData);
The texture is now uploaded and available for rendering on the quad.
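Putting the pieces together, the per-frame update might look like the following sketch. The names g_Capture and g_TextureID, and the assumption that the texture id was created with glGenTextures during initialization, follow the snippets above; window setup and the display callback are assumed to exist elsewhere:

```cpp
// Sketch of the OnIdle callback, assuming initialization has already
// opened the capture device and generated a texture id.
#include <GL/glut.h>
#include <opencv/cv.h>
#include <opencv/highgui.h>

CvCapture* g_Capture;    // e.g. cvCaptureFromCAM(0) or cvCaptureFromAVI(...)
GLuint     g_TextureID;  // created with glGenTextures during init

void OnIdle(void)
{
    // Grab the next frame; cvQueryFrame returns NULL when the stream ends.
    IplImage* image = cvQueryFrame(g_Capture);
    if (!image)
        return;

    // OpenCV delivers BGR; convert to RGB for OpenGL.
    cvCvtColor(image, image, CV_BGR2RGB);

    // Upload the frame as the current texture image.
    glBindTexture(GL_TEXTURE_2D, g_TextureID);
    gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, image->width, image->height,
                      GL_RGB, GL_UNSIGNED_BYTE, image->imageData);

    // Ask GLUT to redraw the textured quad with the new frame.
    glutPostRedisplay();
}
```

Note that the frame returned by cvQueryFrame is owned by the capture structure and must not be released by the caller.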
Compiling and Running
The code was compiled and tested with Microsoft Visual Studio 2008. However, it should compile with any C++ compiler on any platform, since it uses only the OpenGL, GLUT, and OpenCV libraries. Please make sure that they are installed and that the paths to their include and lib directories are set. OpenCV can be downloaded from the OpenCV website.
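On a Unix-like system, a build command along these lines might work. The source file name main.cpp is an assumption, and library names vary by platform and OpenCV version:

```shell
# Hypothetical build line; assumes pkg-config knows about your OpenCV install.
g++ main.cpp -o gl_video $(pkg-config --cflags --libs opencv) -lGL -lGLU -lglut
```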