Introduction
Last year I published the cam2web open source project, which is aimed at streaming cameras over HTTP as MJPEG streams. Although it was developed to support a number of platforms, the very original target was to stream a camera from a Raspberry Pi, so it could be watched remotely. That project by itself can be used for a number of things like home security, IoT devices, etc. However, it was originally started as the first step of another project - building a remote controlled robot.
I did some hobby robotics projects in the past. However, those were based on specialized controllers, which made them a bit costly. This time the plan was to use more or less conventional components, which could be reused from other electronics/robotics/IoT projects, and to keep the total price down.
The idea I had in mind was simple enough - to build a remote controlled robot, which has a camera on board and so allows seeing through the "robot's eyes" remotely. Obviously it was supposed to be able to move around, so wheels and motors are a must. Additionally, some sensors could be of use - an ultrasonic distance sensor, for example.
Another idea was to develop some reusable software, which could be used to control robots built to a similar spec. It should allow either simple robot control directly from a web browser, or development of specialized client applications providing more advanced control and extra features, like computer vision, for example.
And so it was decided to build the robot around a Raspberry Pi board. It comes at a relatively small price and provides enough power and connectivity to handle all the other electronics required for turning the idea into action.
As for the name, it was called PiRex. This was a result of a brief brainstorming session with the kids during a walk. Why PiRex? Well, as my youngest said - it is a Rex based on a Pi. No, it does not look anywhere near as scary as a Tyrannosaurus Rex. But the name just stuck.
Quick preview
Before going into the details of the building process and the software developed, it would be nice to have a quick preview of the final result. Here is how it looked in the end.
Nothing too fancy - just a wheeled robot. And yes, it is not really a Rex. More of a lunch box on wheels, equipped with a camera, a distance sensor, a Wi-Fi antenna and a battery module.
Also, here is a quick video demo, showing control of the PiRex robot from a web browser and from a dedicated .NET client application. The web UI keeps it all simple - run the software on the Raspberry Pi and the robot is ready to be controlled from the web browser of your choice. The client application, on the other hand, allows more agile control with game pad devices, for example, and integration of different image processing SDKs to process video coming from the bot.
GitHub project
All of the developed software is published on GitHub. So, if anyone would like to repeat the building steps and start playing with the robot, it is all there for a quick start, providing the base for any customizations you may have in mind.
Hardware
Before we start writing/compiling any code, let's get the robot built first. For that we'd better get all the parts first, so that we can estimate the total cost and think about how to put it all together. Of course things can be done iteratively, adding new components as things progress. However, it may result in a complete rebuild of the project, sometimes even more than once, which costs time and money (or just money). The issue we'd better avoid is starting with the few components we think are the most critical, building something with them, and then realizing that there is not enough space to fit the next component, or not enough battery power for all the electronics we need, or that the final assembly got heavy enough that the chosen motors simply cannot do much.
How many of those mistakes happened to me while getting to the final working result? Two. First, I could not fit everything into the compact design initially planned. And then I found the motors of my first choice were too cheap and did not have enough power to get the robot moving.
Bill of materials
Below is the list of components which were used for building the PiRex robot, together with their price estimates (based on Amazon or whichever other on-line shop fits better):
- Raspberry Pi board - 32.00£.
- Raspberry Pi Camera Module - 20.00£.
- 4 x 6V 60 RPM Motor, DiddyBorg v1 Wheel & Mounting Kit - 60.00£. Yes, it looks like an investment. But it is a simple fact - good motors are not cheap; cheap motors are not good. Avoid any cheap plastic aimed at Arduino projects - it will not carry any more or less serious load.
- USB Wi-Fi Module with Antenna - 10.00£. The module with antenna is strongly preferred over the one without it, since signal quality may cause nasty communication issues.
- Portable charger for phones/tablets, 5V output - used for powering Raspberry Pi board only. The one I've got does not seem to be available any more. So assume something for about 20.00£.
- L293D Stepper Motor Driver. Usually comes in packs of 5 for about 2.00£.
- HC-SR04 Ultrasonic Distance sensor - price varies a bit, let’s assume 4.00£.
- 8 GB micro SD card for Raspberry Pi image - around 6.00£ depending on model.
- Raspberry Pi camera cable - 4.50£. This might be an optional component depending on how things are built. The default cable was a bit too short for my set-up, so a longer one was required.
- 4 x AA batteries to power motors. Got rechargeable Duracell, which may cost around 8.00£.
- 4 x AA 6V Battery Holder Case. About 1.00£ for two.
- Prototyping PCB Circuit Boards. About 5.00£ for a pack of 20.
- 16-Pin IC Socket may come in handy to avoid frying the motor driver chip when soldering - under 1.00£ for a pack of 5.
- USB Micro B Breakout Board to get 5V from the power supply and direct it to the Raspberry Pi - around 4.00£. It is unlikely you'll manage to connect the Raspberry Pi directly to the power supply, since the latter may be well hidden in your setup. So having a micro USB breakout board somewhere on the outside may help a lot. Plus, having a switch button between the power supply and the Pi is much cleaner - it avoids pulling any cables to switch the robot on/off.
- PCB Mount Screw Terminal Blocks - 1.50£ for a pack of 20. An optional component, but comes in very handy for connecting different components.
- A lunch box. Yes, you are right, I used a lunch box for the robot's body. I did not have access to a 3D printer to craft a custom body, so the choice went for something easy to cut and not too heavy at the same time. Let's say 2.00£.
- Finally, some wires, resistors, a few LEDs for status indication, a few switch buttons to turn things on and off, some nuts and bolts, etc. I had enough of all these from other projects, but let's say another 10.00£ (should be more than enough).
All together it comes to 191.00£. To be on the safe side, in case something was forgotten, or simply to have some spare budget, let's plan on about 200£ for a Raspberry Pi based robot.
It may not sound very cheap, but there is little that can be done to get the final price lower. The most expensive components are the motors with wheels. Well, you have to accept this. Trying to save on them and getting something cheaper may result in an increased price in the end, when it becomes clear the robot cannot really move. Saving much on electronic components or batteries may be tough as well.
Another option would be to use one of the ready-made robotics kits. It really does save time. But for me, it takes away the fun and creativity as well. Anyway, many of the kits are actually more expensive, even without all the electronic components included.
Building it all together
Once all components are at hand, it is time to put them all together into a working robot. To control the motors a simple L293D chip is used, which allows controlling up to two motors independently. The chip itself can only turn motors on/off and control the direction of their rotation. Connecting its 1,2EN and 3,4EN pins (pins 1 and 9, see the L293D data sheet) to a pulse-width modulation (PWM) source also allows controlling the speed of the motors. However, the Raspberry Pi has only one hardware PWM available, so controlling the speed of both motors is not possible with it (unless software PWM is used, which is mentioned later).
Note: for a quick introduction to L293D, its wiring and simple control have a look at this tutorial: Controlling DC Motors Using Python With a Raspberry Pi.
To make things easier to connect, it is preferable to put smaller components on individual circuit boards. For example, as the picture below shows, I put the L293D chip on its own breakout board, which has individual terminal blocks for connecting the motors, the power supply and the Raspberry Pi's pins. This way all the soldering is concentrated on these little boards only, while the rest of the connections are done by screwing a wire into a terminal block on one side and attaching it to a Raspberry Pi pin on the other.
The second breakout board holds 2 LEDs (used to show run time and connectivity status) and a terminal block for connecting two switch buttons (planned to be used for some interaction with the robot). All required resistors are there as well; only the connections to the Raspberry Pi and the components are missing. While the LEDs and switches are really optional for the final result, this board is also used for connecting the HC-SR04 ultrasonic sensor (the lower two resistors form a voltage divider, plus a terminal block).
Note: for more information about wiring and using the HC-SR04 sensor have a look at this tutorial: Interfacing HC-SR04 Ultrasonic Sensor with Raspberry Pi.
Having the two breakout boards above, it's time to pack them into our lunch box together with some other components - this completes the first layer of the robot. Some of the terminal blocks already get connected to motors and switches, while the others wait for the second layer holding the Raspberry Pi. Note: there are two extra switches which are left loose for now. Those are to be used to switch the robot on/off and to provide power to the motors (motors are not really needed during most of the software debugging, so making them independent is quite handy).
The second layer of the assembly holds the Raspberry Pi and another small breakout board, which is used to take power from micro USB sockets and direct it to the electronic components (through switches to turn it all on/off). Initially it was planned to power both the Raspberry Pi and the motors from the same portable charger battery (which does provide two outputs), hence there are two USB connectors on the breakout board. However, later it was decided to power the motors separately with a pack of AA batteries, and so the breakout board got extended a bit with an extra terminal block.
Now, adding the final and the main component, Raspberry Pi, completes the assembly to about 99%. Here is the way it looks if we pop the hood of the PiRex.
Finally, adding batteries and a Wi-Fi module, makes it ready for action. Well, provided everything is connected right and the software is there.
Connecting components
Although it is completely up to you which GPIO pins to use for connecting the different components to the Raspberry Pi (the software discussed below can be configured for this), below is the pin-out I used while building the PiRex robot. Just as a reference.
| Connected to | Name | Pin | Pin | Name | Connected to |
| --- | --- | --- | --- | --- | --- |
| Power supply for 2 switch buttons | 3.3v | 1 | 2 | 5v | HC-SR04 power supply (VCC) |
| | GPIO 2 | 3 | 4 | 5v | L293D chip power supply (VCC1, pin 16) |
| | GPIO 3 | 5 | 6 | GND | Ground for LEDs and switch buttons |
| Run time status LED (running software) | GPIO 4 | 7 | 8 | GPIO 14 | |
| L293D ground (pins 4, 5, 12, 13) | GND | 9 | 10 | GPIO 15 | |
| Connectivity status LED | GPIO 17 | 11 | 12 | GPIO 18 | Enable left motors (1,2EN, pin 1) |
| Signal from switch button 1 | GPIO 27 | 13 | 14 | GND | HC-SR04 ground |
| Signal from switch button 2 | GPIO 22 | 15 | 16 | GPIO 23 | Left motors input 1 (pin 2) |
| | 3.3v | 17 | 18 | GPIO 24 | Left motors input 2 (pin 7) |
| | GPIO 10 | 19 | 20 | GND | |
| | GPIO 9 | 21 | 22 | GPIO 25 | HC-SR04 trigger |
| | GPIO 11 | 23 | 24 | GPIO 8 | |
| | GND | 25 | 26 | GPIO 7 | |
| | GPIO 0 | 27 | 28 | GPIO 1 | |
| Right motors input 1 (pin 15) | GPIO 5 | 29 | 30 | GND | |
| Right motors input 2 (pin 10) | GPIO 6 | 31 | 32 | GPIO 12 | |
| Enable right motors (3,4EN, pin 9) | GPIO 13 | 33 | 34 | GND | |
| | GPIO 19 | 35 | 36 | GPIO 16 | |
| HC-SR04 echo, through voltage divider | GPIO 26 | 37 | 38 | GPIO 20 | |
| | GND | 39 | 40 | GPIO 21 | |
This is all about building the robot and connecting the different components together. With the L293D driver chip's and HC-SR04 ultrasonic sensor's data sheets at hand, and following some tutorials, it should not be hard to figure out which pins go where and to get something working in a test application first.
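For example, a minimal Wiring Pi based test like the one below (just a sketch, assuming BCM GPIO numbering; the pin number matches the run time status LED from the pin-out above) can be used to verify the GPIO wiring before moving on to the full software.

// blinktest.c - minimal wiring check (a sketch, assuming BCM GPIO numbering)
// build with: gcc blinktest.c -o blinktest -lwiringPi
#include <wiringPi.h>

#define PIN_STATUS_LED 4    // run time status LED (GPIO 4 in the reference pin-out)

int main( void )
{
    wiringPiSetupGpio( );               // use BCM GPIO numbers

    pinMode( PIN_STATUS_LED, OUTPUT );

    for ( int i = 0; i < 10; i++ )      // blink the LED a few times
    {
        digitalWrite( PIN_STATUS_LED, HIGH );
        delay( 500 );
        digitalWrite( PIN_STATUS_LED, LOW );
        delay( 500 );
    }

    return 0;
}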
Software
The software running on the PiRex bot's side is heavily based on the cam2web code base. In fact, about 95% of the code is taken from there. The reason for this is that cam2web provides most of the heavy infrastructure needed for this project - image acquisition from the camera, an embedded web server based on the mongoose library, streaming images as an MJPEG stream, infrastructure classes allowing objects to be queried/configured through a REST API, etc.
If we have a look at the core class diagram of the PiRex code, then we'll find that only two classes were added to the project (plus some glue code of course). This explains why cam2web was really the first big step towards building this robotics project.
Let's do a quick overview of what we've got from the cam2web project.
The XWebServer class implements an embedded web server based on the mongoose library API. It allows registering a collection of objects implementing the IWebRequestHandler interface, which provide responses to particular requests. For example, the XVideoSourceToWeb class accepts an arbitrary video source (a class implementing the IVideoSource interface and producing new images via the IVideoSourceListener callback interface) and provides two request handlers - one which streams images as an MJPEG stream, and another which provides images as single JPEG snapshots.
The XEmbeddedContentHandler class allows serving static content from embedded resources. The cam2web project comes with the web2h tool, which converts certain types of files (web resources) into C header files. This way all the required web content can be embedded directly into the final executable, making it independent of any external files.
Finally, the XObjectInformationRequestHandler and XObjectConfigurationRequestHandler classes provide either read-only access through the REST API to objects implementing the IObjectInformation interface, or read-write access to objects implementing the IObjectConfiguration interface. For example, information about the PiRex version and capabilities is provided through the read-only interface, while the camera's settings can be changed and so are exposed through the IObjectConfiguration interface.
To demonstrate the use of the above mentioned classes, here is a small code snippet, which configures a web server to stream the Raspberry Pi's camera, allows changing its settings via the REST API, queries some information about the device and serves some static embedded content. This all came from the cam2web project and it is really easy to use. So why not reuse it?
XWebServer          server;
XVideoSourceToWeb   video2web;

// create Raspberry Pi camera video source and feed its images to the web streamer
shared_ptr<XRaspiCamera> camera = XRaspiCamera::Create( );
camera->SetListener( video2web.VideoSourceListener( ) );

// handlers providing individual JPEG snapshots and the MJPEG stream
server.AddHandler( video2web.CreateJpegHandler( "/camera/jpeg" ) ).
       AddHandler( video2web.CreateMjpegHandler( "/camera/mjpeg", 30 ) );

// handler allowing camera settings to be changed through the REST API
shared_ptr<IObjectConfigurator> cameraConfig = make_shared<XRaspiCameraConfig>( camera );
server.AddHandler( make_shared<XObjectConfigurationRequestHandler>(
                   "/camera/config", cameraConfig ) );

// read-only handler providing version information
PropertyMap versionInfo;

versionInfo.insert( PropertyMap::value_type( "product",  "pirexbot" ) );
versionInfo.insert( PropertyMap::value_type( "version",  "1.0.0" ) );
versionInfo.insert( PropertyMap::value_type( "platform", "RaspberryPi" ) );

server.AddHandler( make_shared<XObjectInformationRequestHandler>(
                   "/version", make_shared<XObjectInformationMap>( versionInfo ) ) );

// handlers serving static embedded web content
server.AddHandler( make_shared<XEmbeddedContentHandler>( "/", &web_index_html ) ).
       AddHandler( make_shared<XEmbeddedContentHandler>( "index.html", &web_index_html ) ).
       AddHandler( make_shared<XEmbeddedContentHandler>( "styles.css", &web_styles_css ) );

camera->Start( );
server.Start( );
The code above is not a complete demonstration of cam2web's features. For example, it does not show how to configure access rights - which request handlers can be accessed by everyone and which only by known users. That can be found either in the original article describing the cam2web project or by diving into the provided source code.
Controlling motors
Controlling motors with an L293D chip is really easy and there are plenty of tutorials on this topic. Most of them are Python based, but it is really trivial to translate those to C code. To get a motor moving, all we need to do is set the Enable pin to High and then set the Input1 pin to High while the Input2 pin is Low. To change the direction of rotation it is required to swap the input pins - set the Input1 pin to Low while the Input2 pin is High. And if we want to stop it all - just set the Enable pin back to Low.
To manipulate the Raspberry Pi's pins from a C application the Wiring Pi library can be used. It comes pre-installed with the official Raspbian image. Here is a quick sample of how to get a motor moving.
#include <wiringPi.h>

// initialize the Wiring Pi library (here using BCM GPIO pin numbering)
wiringPiSetupGpio( );

// configure the left motor's pins as outputs
pinMode( BOT_PIN_MOTOR_LEFT_INPUT1, OUTPUT );
pinMode( BOT_PIN_MOTOR_LEFT_INPUT2, OUTPUT );
pinMode( BOT_PIN_MOTOR_LEFT_ENABLE, OUTPUT );

// rotate forward
digitalWrite( BOT_PIN_MOTOR_LEFT_INPUT1, HIGH );
digitalWrite( BOT_PIN_MOTOR_LEFT_INPUT2, LOW );
digitalWrite( BOT_PIN_MOTOR_LEFT_ENABLE, HIGH );

// rotate backward
digitalWrite( BOT_PIN_MOTOR_LEFT_INPUT1, LOW );
digitalWrite( BOT_PIN_MOTOR_LEFT_INPUT2, HIGH );
digitalWrite( BOT_PIN_MOTOR_LEFT_ENABLE, HIGH );

// stop
digitalWrite( BOT_PIN_MOTOR_LEFT_INPUT1, LOW );
digitalWrite( BOT_PIN_MOTOR_LEFT_INPUT2, LOW );
digitalWrite( BOT_PIN_MOTOR_LEFT_ENABLE, LOW );
The code above allows controlling the direction of a motor's rotation, but not its speed. To get speed control, the Enable pin must be connected to a PWM enabled pin on the Raspberry Pi. The problem is that the Pi has only one hardware PWM available, and so controlling the speed of both left and right motors cannot be done with hardware support. As an alternative, software PWM can be used, which can be enabled on any of the Raspberry Pi's GPIO pins. The code below demonstrates speed control based on software PWM.
#include <wiringPi.h>
#include <softPwm.h>

// configure direction pins as outputs and create a software PWM on the Enable pin
pinMode( BOT_PIN_MOTOR_LEFT_INPUT1, OUTPUT );
pinMode( BOT_PIN_MOTOR_LEFT_INPUT2, OUTPUT );
softPwmCreate( BOT_PIN_MOTOR_LEFT_ENABLE, 0, 100 );

// set direction to forward
digitalWrite( BOT_PIN_MOTOR_LEFT_INPUT1, HIGH );
digitalWrite( BOT_PIN_MOTOR_LEFT_INPUT2, LOW );

// run at half speed
softPwmWrite( BOT_PIN_MOTOR_LEFT_ENABLE, 50 );

// run at full speed
softPwmWrite( BOT_PIN_MOTOR_LEFT_ENABLE, 100 );

// stop
softPwmWrite( BOT_PIN_MOTOR_LEFT_ENABLE, 0 );
Although software PWM does allow controlling the speed of motors, it is not as efficient as hardware PWM. The Wiring Pi library creates a background thread for each software PWM configured, which does frequent updates to the state of the selected GPIO pin. As a result it increases CPU load, which in turn drains the battery quicker. However, software PWM not only affects system performance, it can also be affected by it. If CPU load gets high due to other computations, the software PWM's thread may not get a chance to update the GPIO state at the time intervals required for smooth speed control. Because of these issues, motor speed control is disabled by default in the PiRex configuration.
Measuring distance to obstacles
Distance measurement with the HC-SR04 ultrasonic sensor is a very easy task as well. Being an extremely popular sensor, it is used a lot in many hobby projects and so finding a tutorial about it is not hard. The idea of the sensor is quite simple. It emits an ultrasound wave, which gets reflected from an obstacle and comes back to the sensor. So it is simply required to measure the time between emitting the sound wave and detecting its return. Since the speed of sound is known (about 343 m/s at room temperature), calculating the distance to an obstacle becomes trivial - the round-trip time in microseconds divided by roughly 58 gives the distance in centimetres.
The HC-SR04 does not emit sound waves constantly. Instead, it needs to be told when to emit one and then waits till it gets the reflection back. Two GPIO pins are used for interaction with the sensor. The Trigger pin is used to tell the sensor when to send a sound wave - it must be set to High for a short period of time and then set back to Low. The Echo pin is then used to detect when the sound wave was sent and when it came back. After the measurement is triggered, the Echo pin goes High on sending the wave and gets back to Low on its return. Putting this all into code may look something like this:
uint32_t start, stop;

// configure the sensor's pins (Trigger is output, Echo is input)
pinMode( BOT_PIN_ULTRASONIC_TRIGGER, OUTPUT );
pinMode( BOT_PIN_ULTRASONIC_ECHO, INPUT );

// send a 10 microseconds pulse on the Trigger pin to start a measurement
digitalWrite( BOT_PIN_ULTRASONIC_TRIGGER, HIGH );
delayMicroseconds( 10 );
digitalWrite( BOT_PIN_ULTRASONIC_TRIGGER, LOW );

// wait for the Echo pin to go High - the sound wave was sent
start = micros( );
while ( digitalRead( BOT_PIN_ULTRASONIC_ECHO ) == LOW )
{
    start = micros( );
}

// wait for the Echo pin to go back Low - the reflection came back
stop = micros( );
while ( digitalRead( BOT_PIN_ULTRASONIC_ECHO ) == HIGH )
{
    stop = micros( );
}

// round-trip time in microseconds divided by 58.2 gives the distance in centimetres
float lastDistance = (float) ( stop - start ) / 58.2f;
The above example is a slightly simplified version, but it gives the idea. To improve the code, possible timeouts need to be handled, for the case when an echo did not start/stop within the expected time interval.
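One possible way of adding such timeouts is sketched below. This is just a sketch, not the exact PiRex implementation - the MeasureDistance name and the 30 ms limit are assumptions (30 ms is longer than any valid HC-SR04 echo).

// Distance measurement with simple timeouts (a sketch; assumes the same
// pin defines and Wiring Pi setup as the snippet above).
#define ECHO_TIMEOUT_US 30000

float MeasureDistance( )
{
    uint32_t start, stop, waitStart;

    // trigger a new measurement
    digitalWrite( BOT_PIN_ULTRASONIC_TRIGGER, HIGH );
    delayMicroseconds( 10 );
    digitalWrite( BOT_PIN_ULTRASONIC_TRIGGER, LOW );

    // wait for the echo pulse to start, but not forever
    waitStart = micros( );
    start     = waitStart;
    while ( digitalRead( BOT_PIN_ULTRASONIC_ECHO ) == LOW )
    {
        start = micros( );
        if ( start - waitStart > ECHO_TIMEOUT_US )
        {
            return -1.0f;   // echo never started - report failure
        }
    }

    // wait for the echo pulse to end, again with a timeout
    waitStart = micros( );
    stop      = waitStart;
    while ( digitalRead( BOT_PIN_ULTRASONIC_ECHO ) == HIGH )
    {
        stop = micros( );
        if ( stop - waitStart > ECHO_TIMEOUT_US )
        {
            return -1.0f;   // echo never came back - report failure
        }
    }

    return (float) ( stop - start ) / 58.2f;
}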
Configuring the PiRex software
Although a reference GPIO layout was provided above (which Raspberry Pi pins are connected to what), it does not have to be that way to use the PiRex software. As a minimum requirement, the robot is expected to have a camera and the L293D based motor driver. However, the way it is all connected does not really matter - it can be configured instead. The same goes for the ultrasonic sensor - it is an optional component and so may not be attached at all.
In order to customize the build, there is the BotConfig.h header file. It contains a number of #define's allowing to specify which Raspberry Pi pins are used for which purpose. Simply change those to reflect your setup and you are ready to go.
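As an illustration only (the BotConfig.h file in the repository is the reference; the right motor macro names below are simply assumed to follow the same pattern as the left ones), defines matching the pin-out table above might look along these lines:

// BotConfig.h fragment (illustration only) - GPIO assignments, BCM numbering
#define BOT_PIN_MOTOR_LEFT_ENABLE    18   // L293D 1,2EN (pin 1)
#define BOT_PIN_MOTOR_LEFT_INPUT1    23   // L293D input 1 (pin 2)
#define BOT_PIN_MOTOR_LEFT_INPUT2    24   // L293D input 2 (pin 7)
#define BOT_PIN_MOTOR_RIGHT_ENABLE   13   // L293D 3,4EN (pin 9)
#define BOT_PIN_MOTOR_RIGHT_INPUT1    5   // L293D input 3 (pin 15)
#define BOT_PIN_MOTOR_RIGHT_INPUT2    6   // L293D input 4 (pin 10)
#define BOT_PIN_ULTRASONIC_TRIGGER   25   // HC-SR04 trigger
#define BOT_PIN_ULTRASONIC_ECHO      26   // HC-SR04 echo (through voltage divider)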
Building and running
Building the PiRex code is as easy as running the make command. The only thing which must be kept in mind is that the release build embeds all web resources directly into the executable, so that the default web UI can always be provided without relying on any extra files. This means that the web2h tool must be built first - the tool is used to translate some common web files into C structures defined in header files.
Running the commands below from the project's root folder will produce the required executables in build/release/bin.
pushd .
cd src/tools/web2h/
make
popd
pushd .
cd src/app/
make
popd
Note: the libjpeg development library must be installed for the PiRex build to succeed (it may not be installed by default):
sudo apt-get install libjpeg-dev
Once the build completes, the PiRex software is ready to run. This means the robot is ready for action, provided the hardware part is also done and the configuration reflects the actual GPIO pins' connections.
Note: there are a number of command line options which can be used to change the default camera resolution, the web port to listen on, authentication settings, etc. All of those were inherited from the cam2web project. Just run the application with the -? key to see the list of available configuration options.
Web interface
As already mentioned, the PiRex application comes with a built-in web UI. This means that once the application is running, all you need to start controlling your robot is to type its IP address:port into a web browser of your choice and get connected to it.
The default view simply shows the current view from the robot's camera. However, the "Control" tab allows controlling the robot's movement and getting distance measurements, while the "Camera" tab allows changing various camera settings.
The GitHub repository provides the source code of the default web UI, so if anything needs changing - it is all there. Just don't forget to rebuild the application. Another approach would be to put all the required web content into a separate folder and then tell the PiRex application to serve static web content from there (using the -web:folder_name option). This makes it quicker to debug the web UI before embedding the final content into the executable. Note: building in debug mode automatically populates the folder with web content and the application serves it from there.
.NET client application
Although the PiRex robot can be controlled directly from a web browser, which requires no extra software on the client side, there are cases when a dedicated application may fit the task better. For example, I found that manipulating the robot by clicking UI buttons does not give the best control and as a result does not allow performing some of the tricks easily. Instead, using a game pad device and controlling the left/right motors with individual axes gives much better control of the robot, allowing faster switching between different movement patterns. Another example would be to use some image processing and/or computer vision algorithms on the video coming from the robot's camera. A dedicated client application may fit many such tasks better, while a browser environment can make implementing them much more complicated (if possible at all).
To provide extra flexibility in controlling the PiRex robot, a .NET client application is provided as well, which allows viewing the robot's camera and manipulating the robot by means of the exposed REST API. To demonstrate an alternative way of control, it supports game pad devices, so that the robot's manipulation becomes much more agile.
Combining the .NET client with any of the available computer vision SDKs may turn the robot into a development platform for many interesting applications. It can be used either for some fun games aimed at finding objects hidden in a real world environment, or for inspection tasks where smaller robots fit better into cluttered environments with limited space to move around.
Interfacing with other applications - REST API
As the application described above suggests, the PiRex bot provides an API for interfacing with other applications, so that a native client can be developed to provide more advanced robot control or add extra features, like computer vision, for example.
The first thing is to get the video stream out of the robot, which is provided in the form of an MJPEG stream. It does not provide the best compression - just a stream of individual JPEGs. However, it is very simple to implement and so is supported by a great variety of applications. The URL format to access the MJPEG stream is:
http://ip:port/camera/mjpeg
In case an individual image is required, the next URL provides the latest camera snapshot:
http://ip:port/camera/jpeg
All the other URLs either provide some information about the robot (version, state, configuration), returned in JSON format when performing an HTTP GET request, or can be used to control the robot - an HTTP PUT request with a JSON formatted command.
Getting version information
To get information about the version of the PiRex bot's software, the http://ip:port/version URL can be used, which provides information in the format below:
{
    "status":"OK",
    "config":
    {
        "platform":"RaspberryPi",
        "product":"pirexbot",
        "version":"1.0.0"
    }
}
Getting capabilities and title
To find out whether the PiRex bot is equipped with a distance measurement sensor or allows speed control of the motors, the http://ip:port/info URL is used. It also reports the bot's title, which can be specified as one of the supported command line options.
{
    "status":"OK",
    "config":
    {
        "device":"PiRex Bot",
        "providesDistance":"true",
        "providesSpeedControl":"false",
        "title":"My Home Robot"
    }
}
Distance measurement
For querying distance measurements performed by the PiRex robot, the http://ip:port/distance URL is used. It provides both the most recent measurement in centimetres and the median value taken from the last 5 measurements.
{
    "status":"OK",
    "config":
    {
        "lastDistance":"128.95",
        "medianDistance":"127.25"
    }
}
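The median is simply taken over the few most recent readings, which filters out the occasional bogus measurement. A minimal sketch of such filtering (an illustration, not the actual PiRex code) could look like this:

// Keep the last few distance readings and report their median (a sketch).
#include <algorithm>
#include <deque>

static std::deque<float> lastReadings;

float AddMeasurementAndGetMedian( float newDistance )
{
    lastReadings.push_back( newDistance );
    if ( lastReadings.size( ) > 5 )              // keep only the last 5 readings
    {
        lastReadings.pop_front( );
    }

    std::deque<float> sorted = lastReadings;     // sort a copy to find the median
    std::sort( sorted.begin( ), sorted.end( ) );

    return sorted[sorted.size( ) / 2];
}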
Controlling motors
For motor control, the http://ip:port/motors/config URL is used. If a GET request is sent, the reply simply contains the current state of the motors. This is not of much use though, since most of the time motors are stationary unless told to move.
{
    "status":"OK",
    "config":
    {
        "leftPower":"0",
        "rightPower":"0"
    }
}
Sending a PUT request, however, is what's needed to tell the robot to move. This is done with a simple JSON string, which specifies the power of both motors in the [-100, 100] range. In case speed control is not enabled, there are only three effective values: 100 - rotate forward, 0 - don't move, -100 - rotate backward (the robot will accept intermediate values as well, but threshold them). In case speed control is enabled, the power value can be anything from the mentioned range.
For example, the command below makes the robot rotate clockwise (rotate right).
{
"leftPower":"100",
"rightPower":"-100"
}
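When speed control is disabled, the requested power values can be reduced to simple on/off states of the L293D pins, roughly as sketched below (an illustration of the thresholding idea, not the exact PiRex implementation; it reuses the pin defines from the motor control examples above).

// Map a requested power value to L293D pin states when speed control is off (a sketch).
void SetLeftMotorPower( int power )
{
    if ( power > 0 )        // any positive value - rotate forward
    {
        digitalWrite( BOT_PIN_MOTOR_LEFT_INPUT1, HIGH );
        digitalWrite( BOT_PIN_MOTOR_LEFT_INPUT2, LOW );
        digitalWrite( BOT_PIN_MOTOR_LEFT_ENABLE, HIGH );
    }
    else if ( power < 0 )   // any negative value - rotate backward
    {
        digitalWrite( BOT_PIN_MOTOR_LEFT_INPUT1, LOW );
        digitalWrite( BOT_PIN_MOTOR_LEFT_INPUT2, HIGH );
        digitalWrite( BOT_PIN_MOTOR_LEFT_ENABLE, HIGH );
    }
    else                    // zero - stop
    {
        digitalWrite( BOT_PIN_MOTOR_LEFT_ENABLE, LOW );
    }
}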
Camera configuration
Retrieving the current configuration of the PiRex robot's camera can be done using a GET request sent to the http://ip:port/camera/config URL, which lists the current values of all available properties:
{
    "status":"OK",
    "config":
    {
        "awb":"Auto",
        "brightness":"63",
        "contrast":"41",
        "effect":"None",
        "expmeteringmode":"Average",
        "expmode":"Night",
        "hflip":"1",
        "saturation":"16",
        "sharpness":"100",
        "vflip":"1",
        "videostabilisation":"0"
    }
}
Again, for changing any of the properties, a POST request must be sent to the same URL, providing one or more configuration values to set. For example, below is the command for changing both brightness and contrast:
{
"brightness":"50",
"contrast":"15"
}
Camera information
To get some information about the robot's camera, like its current resolution, the http://ip:port/camera/info URL is available.
{
    "status":"OK",
    "config":
    {
        "device":"RaspberryPi Camera",
        "height":"480",
        "title":"Front Camera",
        "width":"640"
    }
}
Getting description of camera properties
Finally, it is possible to query descriptions of all the supported camera configuration properties by using the http://ip:port/camera/properties URL. This API is inherited from the cam2web project, where it does make sense, since that project supports a number of platforms and camera APIs. For PiRex, however, it is of little use really - only one camera type is supported for now.
Conclusion
It is time to wrap it up. Looking back at what was done to get the PiRex going, I would say it was a really interesting experience, both from the software development point of view and from the hardware assembly side. It was fun building it all, starting from soldering individual small components, to getting it all together and making sure it really does work. Yes, it was rebuilt a couple of times - to replace a burned motor driver, to fit new more powerful motors, to fix a loose connection, etc. But at the end of the day it was nice to see it progressing and improving.
Was the target goal achieved? Sure, it was. The robot was built within a reasonable budget out of fairly conventional electronic components. Since it is all built from smaller individual components rather than a specialized all-in-one robotics controller, it is much easier to replace/upgrade things. This works well for repairs too - if the motor driver gets burned, we just need to replace a small chip which costs little, instead of replacing the entire robotics controller.
The developed software worked quite well too. The PiRex robot can be controlled either from the web browser UI or from a dedicated application calling the exposed REST API. A reference .NET client application is provided to demonstrate how to interface with the robot and how to implement more agile control with the help of a game pad device.
Are there any things which could have been done differently? If the budget allows, a more advanced motor driver could be used, like ThunderBorg, for example, which provides speed control out of the box. Alternatively, an Arduino Nano could be added, which brings enough hardware PWM pins to implement speed control with the L293D motor driver. This option actually sounds more interesting, since an Arduino board brings not only extra digital IO pins, but analog input pins as well, which can be of great use when interfacing with different sensors.
Anyway, the PiRex software in its current shape is really easy to reuse and extend. The provided embedded server and the rest of the infrastructure allow plugging in new REST handlers easily, which can then interact with additional hardware components, if needed.
An after project
What could be next? Well, once a robot is built and functions as expected, it can be used as a platform for many different projects, especially when combined with some image processing and computer vision applications. Below is a quick demo of one such project - hunting glyphs with the PiRex robot. A set of square binary glyphs is hidden somewhere in the environment and the target is to find them all by controlling the robot remotely. I must say it was great fun playing it with the kids!
Well, I am sure many more great ideas can be implemented. Just be creative and keep going. Have fun!