Introduction
From time to time during the past two years I was working on several research projects in the areas of Computer Vision and Artificial Intelligence. This work produced a lot of code, and several articles describing some of these areas were published on Code Project. Publishing these articles, I discovered that these topics are interesting not only to me, but to a wide range of developers as well. Since my first publication on Code Project, I have received many e-mails from interesting people all over the world, who were applying some of my code to a great number of applications. Some of these people were interested not just in using the code, but in extending it and contributing to these projects. From the very first such offer, the idea of an open source project stayed with me.
The aim of this article is to mark the official opening of a new open source project, AForge.NET: a C# framework for researchers in different areas of Computer Vision and Artificial Intelligence. The framework brings together most of my previous work in these areas and is going to be extended with new ideas. At the present moment the framework consists of three significant parts, which were discussed in detail in some of my previous articles:
- Image processing
- Neural networks
- Evolution algorithms
Starting this project, the idea was not just to collect all the previous source code from different areas and provide it to the community as-is. Besides several libraries and their sources, the framework also provides many sample applications that demonstrate its use, and documentation files in HTML Help format that can be used as a reference. The aim of this project is not only to extend its functionality, but also to provide support for it, improving and extending its documentation and sample application set.
Image processing library
The image processing library contains a set of image processing filters and tools designed to address many different tasks of computer vision and image analysis/processing. At the moment the library contains the following set of filters, which keeps growing as new ones are developed:
- Color filters (grayscale, sepia, invert, rotate channels, channel extraction, channel replacing, channel filtering, color filtering, Euclidean color filtering, RGB channels linear correction)
- HSL filters (linear correction, brightness, contrast, saturation, hue modifier, HSL filtering)
- YCbCr filters (linear correction, YCbCr filtering, channel extraction/replacement)
- Binarization filters (threshold, threshold with carry, ordered dithering, Bayer dithering, Floyd-Steinberg, Burkes, Jarvis-Judice-Ninke, Sierra, Stevenson-Arce, Stucki dithering methods)
- Adaptive binarization (simple image statistics)
- Mathematical morphology filters (erosion, dilation, opening, closing, hit & miss, thinning, thickening)
- Convolution filters (mean, blur, sharpen, edges, Gaussian blur, sharpening based on a Gaussian kernel)
- Two-source filters (merge, intersect, add, subtract, difference, move towards, morph)
- Edge detectors (homogeneity, difference, Sobel, Canny)
- Gamma correction, Median filter
- Conservative smoothing, jitter, oil painting, pixellate, simple skeletonization
- Blob counter and connected components labeling filter
- Texture generators (clouds, marble, wood, labyrinth, textile), texturer, textured filter, texture merge filter
- Resize and rotation (nearest neighbor, bilinear, bicubic)
- Frequency filtering with FFT
- Image statistics
Before using the library routines, it is required to ensure that the source image is in one of the two formats supported by the library: a 24 bits per pixel color image, or a grayscale image represented as an 8 bits per pixel indexed image:
// load an image and make sure it is in one of the formats supported by the library
System.Drawing.Bitmap image = (Bitmap) Bitmap.FromFile( fileName );
AForge.Imaging.Image.FormatImage( ref image );
The library defines two main interfaces, IFilter and IInPlaceFilter, which are implemented by the image processing filters. The first interface is obligatory for all filters and describes their core functionality: applying a filter through it does not modify the source image; instead, a new image is returned as the result of the image processing routine, and the source image is left untouched. The second interface is implemented only by those filters that may be applied directly to the source image, updating it as the result of the image processing routine.
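To give a feeling of their shape, a minimal sketch of the two interfaces is shown below. It is inferred from the way the filters are used in the samples of this article, so the actual declarations in the framework may contain additional members:

public interface IFilter
{
    // apply the filter to an image and return a new image,
    // leaving the source image untouched
    Bitmap Apply( Bitmap image );
}

public interface IInPlaceFilter
{
    // apply the filter directly to the source image, modifying it
    void ApplyInPlace( Bitmap image );
}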
The below sample code demonstrates the use of a filter that can be applied directly to the source image:
// keep only pixels with hue in the [330, 30] degrees range (red colors)
// and with saturation of at least 0.5
HSLFiltering filter = new HSLFiltering(
    new IntRange( 330, 30 ),
    new DoubleRange( 0.5, 1 ),
    new DoubleRange( 0, 1 ) );
// clear only the saturation of pixels outside the range,
// keeping their hue and luminance
filter.UpdateLuminance = false;
filter.UpdateHue = false;
// apply the filter directly to the source image
filter.ApplyInPlace( sourceImage );
The above sample demonstrates the use of an HSL filter, which keeps only pixels within the specified HSL range and clears pixels outside of it. With the help of additional configuration properties, it is possible to clear not the entire pixel, but only certain HSL channels. The above sample keeps only red colors with saturation values above 0.5; all other colors are converted to grayscale.
The next sample demonstrates the use of a filter, which produces a new image as a result of its work:
// apply the filter, producing a new image and leaving the source untouched
IFilter filter = new FloydSteinbergDithering( );
Bitmap newImage = filter.Apply( sourceImage );
This sample demonstrates the use of a binarization algorithm known as Floyd-Steinberg dithering.
Neural networks library
The neural network library implements some common and popular neural network architectures and learning algorithms. It may be applied to a range of problems that can be solved with multi-layer feed-forward networks using supervised learning algorithms, or with self-organizing networks using unsupervised learning algorithms. The main idea in designing the library was to keep it flexible and reusable, so that it is easy both to extend the library with new neural network architectures and learning algorithms, and to apply it to a vast range of problems.
For example, let's take a look at how to solve some common problems with the library. The code sample below demonstrates how to apply the library to a classification problem:
// 1) learning data: one input vector and one desired output vector per sample
double[][] input = new double[samples][];
double[][] output = new double[samples][];
// 2) create a one-layer network with threshold activation function,
// two inputs and one neuron per class, plus a perceptron teacher for it
ActivationNetwork network = new ActivationNetwork( new ThresholdFunction( ),
    2, classesCount );
PerceptronLearning teacher = new PerceptronLearning( network );
teacher.LearningRate = learningRate;
// 3) teach the network, running one learning epoch per iteration
while ( ... )
{
    double error = teacher.RunEpoch( input, output );
    ...
}
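For reference, the data preparation step, which the sample above leaves out, could look like the following sketch. The point coordinates and class labels used here are invented for illustration; it assumes two input features per sample and a one-of-N encoding of the class on the output neurons:

// hypothetical preparation of the learning data:
// two features per sample, one output neuron per class
for ( int i = 0; i < samples; i++ )
{
    // input vector of the sample (two features)
    input[i] = new double[2] { points[i].X, points[i].Y };

    // desired output vector: 1 for the neuron of the sample's class, 0 for the rest
    output[i] = new double[classesCount];
    output[i][classes[i]] = 1;
}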
This classification sample may be divided into three main parts: 1) preparing the learning data, 2) creating and initializing the neural network and the learning algorithm, and 3) teaching the network. Let's take a look at another code sample, which solves a very different problem: approximation.
double[][] input = new double[samples][];
double[][] output = new double[samples][];
// network with one input, one hidden layer and a single output neuron,
// using a bipolar sigmoid activation function
ActivationNetwork network = new ActivationNetwork(
    new BipolarSigmoidFunction( sigmoidAlphaValue ),
    1, neuronsInFirstLayer, 1 );
// back propagation teacher
BackPropagationLearning teacher = new BackPropagationLearning( network );
teacher.LearningRate = learningRate;
teacher.Momentum = momentum;
while ( ... )
{
    double error = teacher.RunEpoch( input, output ) / samples;
    ...
}
The code looks rather similar to the above one and may also be divided into the same three parts. The only difference between these two code samples is the learning data preparation routine and some network/learning algorithm parameters. Of course, different problems have different input and output data, so they may differ in the way the data is prepared for neural network learning. But for the rest, these two code samples are very similar in concept.
The above two samples demonstrated how to use supervised learning algorithms and feed-forward networks. Now let's take a look at another sample, which utilizes an absolutely different neural network architecture – the Kohonen Self-Organizing Map applied to the color clustering task:
// initialize neurons' weights randomly in the color value range
Neuron.RandRange = new DoubleRange( 0, 255 );
// distance network with 3 inputs (R, G, B) and 100x100 neurons
DistanceNetwork network = new DistanceNetwork( 3, 100 * 100 );
SOMLearning trainer = new SOMLearning( network );
double[] input = new double[3];
while ( ... )
{
    // prepare a random color sample and run a single learning iteration for it
    input[0] = rand.Next( 256 );
    input[1] = rand.Next( 256 );
    input[2] = rand.Next( 256 );
    trainer.Run( input );
    ...
}
The concept of this sample is in some ways similar to the two above samples, although it skips the separate data preparation step: instead, a sample is prepared just before each learning iteration. Rather than running the learning algorithm for the entire data set at once (a learning epoch), it is run only for the just-prepared data sample.
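If the whole data set is known in advance, it could also be prepared once and trained in epochs, similar to the supervised samples above. The following is only a sketch; it assumes that the unsupervised teacher exposes a RunEpoch method accepting the whole data set, analogous to the supervised teachers:

// hypothetical epoch-style variant of the same training
double[][] inputs = new double[1000][];
for ( int i = 0; i < inputs.Length; i++ )
{
    // random color sample (R, G, B)
    inputs[i] = new double[3] { rand.Next( 256 ), rand.Next( 256 ), rand.Next( 256 ) };
}

while ( ... )
{
    double error = trainer.RunEpoch( inputs );
    ...
}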
Evolution algorithms library
The evolution computation library implements several popular algorithms, such as Genetic Algorithms (GA), Genetic Programming (GP) and Gene Expression Programming (GEP). This makes it applicable to many different types of problems. The design idea for this library was kept the same as for the entire framework: making it flexible, reusable and easy to use.
As in the case of the neural network library, the use of the evolution library is simple and analogous across a variety of problems. To illustrate, we'll take a look at two examples: 1) function optimization and 2) function approximation.
Function optimization
// the function to be optimized, defined on the [0, 255] range
public class UserFunction : OptimizationFunction1D
{
    public UserFunction( ) :
        base( new DoubleRange( 0, 255 ) ) { }

    public override double OptimizationFunction( double x )
    {
        return Math.Cos( x / 23 ) * Math.Sin( x / 50 ) + 2;
    }
}
...
// population of 40 binary chromosomes with elite selection;
// each call to RunEpoch runs a single epoch of the genetic algorithm
Population population = new Population( 40,
    new BinaryChromosome( 32 ),
    new UserFunction( ),
    new EliteSelection( ) );
population.RunEpoch( );
Function approximation
// sample points (x, y) of the function to be approximated
double[,] data = new double[5, 2] {
    {1, 1}, {2, 3}, {3, 6}, {4, 10}, {5, 15} };
// population of 100 genetic programming tree chromosomes;
// the symbolic regression fitness gets the data points and a set of constants
Population population = new Population( 100,
    new GPTreeChromosome( new SimpleGeneFunction( 6 ) ),
    new SymbolicRegressionFitness( data, new double[] { 1, 2, 3, 5, 7 } ),
    new EliteSelection( ),
    0.1 );
population.RunEpoch( );
The above two code samples look rather similar. The greatest difference is the part defining the fitness function for the evolution algorithm. In the first sample, the fitness function is defined by implementing the function to be optimized. In the second sample, a standard fitness function from the library is used and only its initialization data are prepared. The rest of the two samples may differ in certain details, but they still look very similar in concept.
The fact that most samples are so similar is achieved by implementing most entities of evolutionary computation in separate classes. This allows them to be easily reused and combined, like Lego blocks, to solve a particular task, as the sketch below illustrates.
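For example, switching the optimization sample above to a different selection strategy only requires passing another selection object to the Population constructor. The sketch below assumes that a roulette wheel selection class is available in the library alongside the elite one:

// same optimization task, different selection strategy:
// only the last constructor argument changes
Population population = new Population( 40,
    new BinaryChromosome( 32 ),
    new UserFunction( ),
    new RouletteWheelSelection( ) );
population.RunEpoch( );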
More samples
As was stated in the beginning of this article, the framework comes not only with a set of libraries and their sources, but also with a set of sample applications for each area of the framework. For these, as well as for the latest updates and releases, support and the discussion group, visiting the project's home page is recommended.
Project home page
As is very common for Open Source projects, this one has a home page, which provides access to the project's information, source code, stable releases, discussion groups and issue tracking system. At the moment, the project is hosted on Google Code and is accessible via the following link: http://code.google.com/p/aforge/.
Why Google Code? Google is an extremely fast-developing company, which provides more and more services with each passing day. All of these services are of high quality and are well integrated with each other. So, it is believed that by coming together with Google, the project will have a good place to live.
Accessing source code and stable releases
Google Code uses Subversion as its source control system, which is very convenient to use. To get access to the project repository you may use command line utilities or one of the many available client applications. Personally, I prefer the TortoiseSVN client, which integrates with Windows Explorer and provides access to the repository through a nice GUI.
To keep up to date, you may check the repository's log from time to time. It provides information about all the latest commits: their time and their notes, which describe what exactly was changed or added in each particular commit.
If you notice any changes in the working repository and you are brave enough or willing to get a critical update you are interested in, then you may take the freshest snapshot of the sources from the project's working folder (trunk). But if you would like to be sure about what you are getting and prefer a more stable release, then it is advisable to download the project from the latest known tag. To navigate through the project repository and learn about its structure and available branches and tags, you may use the Repo-browser, which is accessible from the Explorer context menu.
Submitting issues and requests for new features
As usually happens, the project is not ideal and may have bugs or issues, which should be reported and put into the fix queue. Also, many people may want an extra feature or have some other request for extending the project. To provide centralized storage for all these issues and requests, the project has an issue tracking system, provided by Google Code and available from the project's home page, which allows the submission of issues of any type, marking their type, priority and area.
Please, if you find a bug or have an extension request, try to submit it through the project's issue tracking system rather than through CodeProject. This will make it much easier to track all of these different issues: they will be stored in one centralized place, so it becomes possible to search for an issue and its status before submitting it again.
Participating in the project
Do you have an interest in the project? Do you want to participate in its discussions? Do you want to become a member? If so, you are more than welcome in the project's discussion group.
Conclusion
Working on different parts of this project, I learned a lot in many different areas. The project became not just a hobby for me: some of its parts were used in my bachelor's degree work, as well as in various research works and projects. Combining all of this together and sharing it as an Open Source project also gave me new experience in project management, control and organization. I hope that the project will be found interesting by more than just me and that it will be used by many people. I hope that some will even join it, extending it with new ideas.
History
- [18.05.2007] - Article moved from C# Algorithms subsection "Neural Networks" to subsection "General"
- [16.05.2007] - Article edited and posted to main CodeProject.com article base
- [23.02.2007] - 1.1.0 version
- Project converted to .NET 2.0
- Completed merge with older stand-alone versions of the Imaging and Math libraries
- Sandcastle is used for documentation generation
- [22.12.2006] - Initial publication of the article, 1.0.0 release of the framework and official opening of the open source project