Introduction
Rev Warrior is designed to help automobile performance enthusiasts get the most out of their cars or trucks. At its most basic, it is a data collection tool that will use the sensors in an Ultrabook and (optionally) an OBDII connection to gather and analyze performance data to help tuners squeeze the most out of their vehicles. While Rev Warrior will take advantage of a touchscreen interface and use WPF to look like a Metro-style app, it will be a full desktop app because of the requirement for OBDII connectivity: the serial APIs needed to communicate with the external hardware are not available on the WinRT stack.
Background
There are a handful of apps for performance tuning on Android and iOS, but they are limited in their functionality. The OBDII features are sometimes not fully implemented or are hard to use, there is typically no ability to start a "performance run" and record its data, nor any ability to compare that data from run to run to get a picture of how changes to the vehicle's configuration impact performance. As someone who has tried to use some of these apps, I found them useful, but I quickly ran into the limitations of each. And in their defense, mobile devices are limited, so a full-fledged, full-featured data acquisition application is hard to do. Hence the beginning of an idea.
When I initially thought of this in early 2012, one of the main problems was the gathering of sensor data independent of the vehicle. I toyed with the idea of a mobile app that would use the sensors in a phone to pass the data to an app running on a laptop. But that was complicated and there would be difficulty in matching up the data points as there would be no 'master clock'. Adding a sensor package to the laptop would solve the problem and I looked at the Netduino as a possible solution. But I'm not in the hardware business and while I won't rule something like that out (to support a laptop in the future) it created a higher barrier to entry than I was willing to climb. Enter the Ultrabook... with the power of a laptop (so a full-featured app could be written) combined with the built-in sensor support. Finally a solution to my problem and the vehicle to see this idea turn into an actual application.
Basics of the App
This is only the initial design document for the application and I would invite your questions, comments, and suggestions. Input from others will be vital to make this a useful app and as the application develops, this article will evolve.
The app will be divided into live and analysis sections. In the live section of the application the user will be able to see the current performance parameters, manage and view the OBDII data if an adapter is present, set up the performance run data acquisition, and view live data on a configurable dashboard. The dashboard will have an option to mirror the image in the event the user wants to put the device on the dash of the vehicle and view the information in a 'heads-up' display mode.
The focus of the application will be performance data acquisition. The user will be able to set up the parameters to collect for a run (including setting up and managing multiple vehicles) and then "arm" that run. Once armed, the software will begin logging the performance parameters of the car into a circular buffer. If only the Ultrabook sensors are available (no OBDII interface), this will be limited to acceleration, orientation, and GPS data. The app will buffer the performance data until it sees a significant change in one or more parameters indicating the start of a run (e.g., the driver stomps on the gas pedal). The buffered data as well as the current data will be written to a run file. (This allows us to capture data prior to the start of the run so we don't miss anything.) When the vehicle returns to zero speed or the user presses a button to stop acquisition, the software will stop recording the data. Immediately, the user will be able to switch to an analysis view to visualize the performance data on graphs showing things like acceleration, 0 to 60 time, roll, braking distances, etc. GPS data points will allow the path of the run to be plotted and the data associated with each point to be inspected. If an OBDII connection exists, the engine performance parameters (throttle position, timing offsets, boost, etc.) will be recorded as well. The date, time, and location of the run will be recorded, as will the current weather conditions if a data connection is present and the conditions can be retrieved. The user will also be able to attach notes to the run to keep track of what they might have changed and been testing at the time.
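As a rough sketch of how the arm/buffer/trigger flow might look in code (the types, names, and the 0.3 g launch threshold below are illustrative assumptions, not final design):

using System;
using System.Collections.Generic;

// Illustrative only: SensorSample and the launch threshold are assumptions.
class SensorSample
{
    public DateTime Timestamp;
    public double AccelerationX;   // longitudinal acceleration in g
}

class RunRecorder
{
    const int PreTriggerDepth = 500;
    readonly Queue<SensorSample> _preTrigger = new Queue<SensorSample>(PreTriggerDepth);
    readonly List<SensorSample> _run = new List<SensorSample>();
    bool _triggered;

    public void OnSample(SensorSample sample)
    {
        if (!_triggered)
        {
            // While armed, keep only the most recent samples.
            if (_preTrigger.Count == PreTriggerDepth)
                _preTrigger.Dequeue();
            _preTrigger.Enqueue(sample);

            // A hard launch marks the start of the run.
            if (Math.Abs(sample.AccelerationX) > 0.3)
            {
                _triggered = true;
                _run.AddRange(_preTrigger);   // keep the lead-up to the launch
            }
        }
        else
        {
            _run.Add(sample);   // record until the vehicle stops or the user presses stop
        }
    }
}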
If a data connection is present, the user will be presented with an option to save the performance data to cloud storage. This could be SkyDrive, Google Drive, Box, etc. The advantage is that if the Ultrabook were accidentally damaged during performance testing (because someone forgot to belt it in), the data would still be safe and sound and ready to be loaded onto the replacement Ultrabook!
The analysis section of the application will come into play when it is time to compare one run against another. By choosing and filtering the previous runs that will be compared, the user will be able to get a handle on the performance gains or losses from one run to the next. Overlaying runs on the same graphs will give the user the ability to quickly see where their tuning efforts are working. Swiping back and forth between graphs will display different parameter comparisons while the ability to pinch out will switch to a 4-graph view that allows the user to see 4 parameters of their choosing at one time.
The application will not make suggestions to the end user. Analysis of the data and what it means to the performance of the vehicle in question will be up to the user.
Another key area of frustration is the OBDII system and its implementation in software. Other apps I've used expose this data, but usually in a not-very-user-friendly-and-rather-cryptic way. With some time and thought it can be done much better. However, the OBDII connection and its protocol are rather complex, so support for this aspect will have to grow over time within the application. Initially, the software will support reading the standard OBDII parameters. Sometimes, in tuning a vehicle, it is helpful to be able to edit the values in the car's ECM via OBDII. Commercial scanners can do this, and Rev Warrior should be able to do it as well, provided the OBDII interface hardware supports it.
Newer vehicles also use a CAN interface on the OBDII connector to connect with the vehicle's CAN bus. Most advanced functionality like body control, ABS, etc., is on the CAN bus, and reading it is a common feature on high-end commercial scanners. The ability to read these parameters and error codes from the CAN bus will also be supported within Rev Warrior, provided the OBDII hardware supports the connection. This feature may take some time, as each manufacturer creates its own codes and there is very little standardization.
Points of Interest
The OBDII connection is, in my opinion, the most interesting part of the application. Which is funny considering that I was initially worried about the acceleration, gyroscopic, and GPS sensing. While the process of enumerating the available sensors and collecting the data from them is still interesting, the availability of these in the Ultrabook form factor has taken away the complexity. Now, the OBDII connection is where the rubber meets the road. While physical performance data like acceleration, distance, and time are all important, what the performance tuner really needs is that view into the ECM and the engine... to know what the innards of the vehicle are doing while the run is going on. By correlating these different data aspects, the tuner can zero in on where changes will gain the most performance. Modern vehicle ECMs expose a ton of data from all of the sensors onboard the vehicle (and there are a lot of them!). Gathering this data, recording it, and exposing it for consumption really is the key to getting the most out of the vehicle. Fortunately, once the connection is made to the ECM, identifying the available data points and then asking for that data isn't difficult. Ultimately, to get the OBDII part working, the software will have to connect to the interface device, then connect via the device to the ECM and enumerate the available data. Easy... right?
Project Updates
November 5, 2012 - One of the features that is important to me in this project is the ability to save the performance data to Google Drive or to SkyDrive. I had some downtime and was able to start looking into these two aspects of the application. SkyDrive seems to have an easy-to-use API that is, not surprisingly, well integrated into Windows 8, so I figured I would come back and look at it later. I then turned my attention to the Google Drive API. I found that Google has a library of .NET components to help access Drive, but their documentation wasn't that great. The samples showed a very basic console app that required authorization each time you ran it. Not an ideal situation, so I started looking around. It took 4 days of on-and-off effort (mostly on) to finally get it all sorted out. I wrote the process up in an article here on working with Google Drive in WPF. The process there can be applied to other Google services like Tasks, Calendar, etc. Once you get past the authentication, their libraries make using the services a breeze. It was getting past the stupid authentication that proved to be the headache.
The Final Application - December 3rd, 2012
So I did get the app done and submitted in time for Round 2. However, after 8 days of waiting the app was rejected (more on that in a minute), and although I fixed the issues and resubmitted, I didn't hold my breath given that the deadline was 3 days away and they took 8 days the first time. Sure enough, it is now Dec 3rd, two days past the deadline for judging, and my app still hasn't been approved. Am I disappointed? Absolutely... who wouldn't be? But I'm going to give a rundown of what I did and what I learned. Maybe it will help someone in the future.
So I was actually really pleased with the application I submitted, even though the time frame for the contest forced me to make some sacrifices. I even took three days off from my regular job so that I could get Rev Warrior finished by the deadline. In the end I had to omit some key features. I wanted Google Drive and SkyDrive integration; I had to cut that. I wanted more ways to analyze the captured data and compare it to previous runs; I had to cut that as well and deliver only a handful of performance graphs. I also wanted OBD II connectivity and, although it was delivered in the original submission, I had to cut it from the follow-up.
Touch and Gestures
I also wanted the UI to have the feel of the panorama control that is present in Windows Phone. I really like that paradigm as it is very intuitive for the user to swipe left and right to move to different areas of the application. In the initial design, I was going to have three panels... one for instrumenting the sensors and capturing the data, one for analyzing the data, and one for interacting with the ECU (Engine Control Unit... the brains of the car). Because of the lack of serial support in WinRT, I was forced into creating a WPF desktop app. Much to my utter disappointment, there is no panorama control in WPF. Some people have tried to replicate it but I found they didn't meet my requirements or were too complicated to use in the time that I had allotted. I ended up using pages and transitions with buttons on each side that the user could touch to move between panels. I didn't really like it but it worked. I learned that touch and gesture support in WPF is really lacking. If someone comes out with a good pano control, I will fix the UI because I'm just not happy with the way it is.
The other thing I found was that the WPF controls expose different events for different means of actuation. For example, the lowly button exposes left mouse down and up, right mouse down and up, touch down and up, stylus down and up, and keyboard down and up. It was a real pain to hook all of those events just to determine that a user was pressing and holding the record button in the application. Button Pressed and Button Released would have been really useful aggregate events... I don't care how the button was pushed, just that it was. In a touch-enabled world, and especially on the hybrid-input Windows 8 devices, there is a chasm between the Click event and all of the other events. And by the way... you have to use the Preview versions of those events in XAML or the code-behind... otherwise you hook the event handlers and nothing happens, presumably because Click is being raised first and is setting the Handled flag to true.
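Here is a minimal sketch (not Rev Warrior's actual code-behind) of funneling the mouse, touch, and stylus press/release events on a WPF button into a single pressed/released pair; the Preview* events are used because, as noted above, the bubbling events can arrive already handled:

using System.Windows;

public partial class DashboardView : Window
{
    public DashboardView()
    {
        InitializeComponent();   // assumes a Button named recordButton in the XAML

        recordButton.PreviewMouseLeftButtonDown += (s, e) => OnRecordPressed();
        recordButton.PreviewTouchDown           += (s, e) => OnRecordPressed();
        recordButton.PreviewStylusDown          += (s, e) => OnRecordPressed();

        recordButton.PreviewMouseLeftButtonUp   += (s, e) => OnRecordReleased();
        recordButton.PreviewTouchUp             += (s, e) => OnRecordReleased();
        recordButton.PreviewStylusUp            += (s, e) => OnRecordReleased();
    }

    private void OnRecordPressed()  { /* start the press-and-hold timer */ }
    private void OnRecordReleased() { /* arm or cancel based on hold duration */ }
}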
User Interface Design
I have to give my brother credit for the UI. He is artistic, I am not, and he graciously came up with the design concept for me. Here is a screenshot of what he came up with, after I implemented it:
Note that this is what I originally submitted to Intel. You'll see what I ultimately submitted (and failed to get approval by the deadline on) in just a sec. I think his idea was a home run. The idea was to create a dashboard feel and I think he really came through. The gauges all reflected live data until you arm the recording system by pressing and holding the arm button. When that happens, I unhooked all of the event handlers that updated the gauges... essentially "unwiring" them. The purpose of doing that was so I was not hindering the data capture by processing unneeded UI events. Besides... the driver should be looking at the road when they are doing a performance run... not the Ultrabook.
When the run was over and the user pressed the record button again to stop data capture, the events were hooked back up and the system would start displaying live data again. If the user pressed the buttons on the left or right side to transition to the analysis page, the program would slide the gauges out and bring the graphs in. The graphs would auto-scale based on the number of data points in the capture file.
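A sketch of that unwiring/rewiring idea (the gauge and sensor names are hypothetical; the DataChanged event comes from the sensor logging class shown later in this article):

// Hypothetical sketch, not the app's actual code.
private void ArmRecording()
{
    // Stop pushing live readings into the UI while a run is being captured.
    accelSensor.DataChanged -= OnAccelDataChanged;
    gyroSensor.DataChanged  -= OnGyroDataChanged;
}

private void StopRecording()
{
    // Resume live gauge updates once the run file has been written.
    accelSensor.DataChanged += OnAccelDataChanged;
    gyroSensor.DataChanged  += OnGyroDataChanged;
}

private void OnAccelDataChanged(object sender, System.EventArgs e) { /* update the acceleration gauge */ }
private void OnGyroDataChanged(object sender, System.EventArgs e)  { /* update the pitch/yaw/roll gauges */ }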
I would like to adjust the pitch/yaw/roll gauges to use a vehicle rather than just a needle, something like the artificial horizon in an airplane control panel. I think that will help clarify those gauges, so I'll do that in a later version.
Capturing the Data
One of the things I was most proud of was the data capture. If you think about it, this was quite a challenge. I needed to be able to capture data from an arbitrary number of sensors at a reasonable capture rate for an undefined length of time... all while making the data self-describing so it could easily be imported into Excel or some other application for later processing and analysis. The other dimension of this was that I needed to correlate the data from multiple sensors with each other in time and keep them in order as they were written out to a capture file.
When it comes to storing data in a manner that can self-describe, I love XML because of the XML serialization classes. They make it so easy to get data in and out of a file, it is a very portable format, and it also made it easy to satisfy the requirement that the data could be pulled into Excel without much effort. So the result was a serializable "Sensor Logging" class. This class provides three public properties: the sensor name, the sensor data type, and whether the sensor is actually available to us. There are also two public methods which return the sensor data value. The class looks like this:
namespace RevWarrior.Models
{
    // Base class for all sensor logging objects. Each concrete sensor wraps one
    // data source and exposes its name, data type, and availability.
    abstract class SensorLoggingBase : INPCBaseClass, ISensorLogging
    {
        protected bool _logSensorData = true;

        // Raised when the underlying sensor reports a new value.
        public event EventHandler DataChanged;

        public abstract string SensorName { get; }
        public abstract Type SensorDataType { get; }
        public abstract bool SensorAvailable { get; }

        // Lets the user include or exclude this sensor from a capture.
        public virtual bool LogSensorData
        {
            get { return _logSensorData; }
            set
            {
                if (_logSensorData != value)
                {
                    _logSensorData = value;
                    NotifyPropertyChanged();
                }
            }
        }

        // Returns the current raw reading.
        public abstract object GetSensorData();

        // Returns the current reading packaged for logging.
        public abstract SensorDataObject GetSensorDataSnapshot();

        public virtual void OnDataChanged()
        {
            EventHandler dataChangedEventHandler = this.DataChanged;
            if (dataChangedEventHandler != null)
            {
                dataChangedEventHandler(this, EventArgs.Empty);
            }
        }

        // Treats a missing (null) reading as zero.
        public virtual double BoxNullableDouble(double? nullableDouble)
        {
            if (nullableDouble == null)
            {
                return 0;
            }
            return (double)nullableDouble;
        }
    }
}
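To show how a concrete sensor might plug into this base class, here is a hedged sketch of a wrapper around the WinRT accelerometer. Rev Warrior's actual derived classes aren't shown in this article, and the shape of SensorDataObject is an assumption on my part:

namespace RevWarrior.Models
{
    using System;
    using Windows.Devices.Sensors;   // WinRT sensors, usable from a desktop app (see the WinRT section below)

    // Hypothetical derived class; not the app's actual implementation.
    class AccelerometerXLogging : SensorLoggingBase
    {
        private readonly Accelerometer _accelerometer = Accelerometer.GetDefault();

        public override string SensorName { get { return "AccelerometerX"; } }
        public override Type SensorDataType { get { return typeof(double); } }
        public override bool SensorAvailable { get { return _accelerometer != null; } }

        public override object GetSensorData()
        {
            if (!SensorAvailable)
                return 0.0;
            return _accelerometer.GetCurrentReading().AccelerationX;
        }

        public override SensorDataObject GetSensorDataSnapshot()
        {
            // Assumes SensorDataObject can be built from a name/value pair;
            // its real definition isn't shown in the article.
            return new SensorDataObject(SensorName, GetSensorData());
        }
    }
}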
The INPCBaseClass is just a base class that provides the INotifyPropertyChanged implementation and support code. You might notice that in NotifyPropertyChanged I don't pass the property name. This is a nice feature of .NET 4.5... the [CallerMemberName] attribute tells the compiler to fill the name in for you. That method now looks like:
public void NotifyPropertyChanged([CallerMemberName] string propertyName = null)
(How awesome is that? No more passing and managing the property names!)
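For reference, a minimal base class along those lines might look like this (a sketch; the article doesn't show the actual INPCBaseClass):

using System.ComponentModel;
using System.Runtime.CompilerServices;

public abstract class INPCBaseClass : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    // [CallerMemberName] lets the compiler supply the calling property's name,
    // so setters can simply call NotifyPropertyChanged() with no argument.
    public void NotifyPropertyChanged([CallerMemberName] string propertyName = null)
    {
        PropertyChangedEventHandler handler = PropertyChanged;
        if (handler != null)
        {
            handler(this, new PropertyChangedEventArgs(propertyName));
        }
    }
}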
Each sensor logging object would be assigned to a sensor as the sensor classes were built. If the sensor existed, the logging object would be added into a Dictionary collection. This turned out to be a problem when I went to serialize the data to disk... dictionaries are not serializable... but I actually found a SerializableDictionary implementation posted by Dacris Software that fit the bill to a T... (<Type T> to be exact)... and solved that issue. (There is a SerializableDictionary in the BCL namespace, but it looked like support might be sketchy so I avoided it.)
So now I had a dictionary of sensor data objects keyed by sensor name. This would allow me to get each sensor, in order, and grab a current reading. So what do I do with all of these readings? At some point in time, let's say 1 second into the grab, I pull readings from maybe 15 sensors... what do I do with them? The answer was to wrap the dictionary in a class that adds a timestamp, and to put that into a linked list. The result is a linked list of data points, each of which has a timestamp and a dictionary of the readings. Linked lists are great because adding nodes at the end is an extremely fast operation. The number of points I could acquire would be limited only by system memory. And since a linked list is really easy to walk, writing the list out to disk would be very simple.
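A sketch of that timestamped wrapper and the linked list, using assumed names (and assuming the ISensorLogging interface exposes the same members as the base class above):

using System;
using System.Collections.Generic;

// One timestamped snapshot of every logged sensor (hypothetical shape).
public class SensorDataPoint
{
    public DateTime Timestamp { get; set; }
    public Dictionary<string, object> Readings { get; set; }
}

class DataCaptureSketch
{
    readonly LinkedList<SensorDataPoint> _recording = new LinkedList<SensorDataPoint>();

    public void CaptureOnePoint(IDictionary<string, ISensorLogging> sensors)
    {
        var point = new SensorDataPoint
        {
            Timestamp = DateTime.Now,
            Readings = new Dictionary<string, object>()
        };

        // Walk the sensors in a fixed order and grab a current reading from each.
        foreach (var pair in sensors)
        {
            if (pair.Value.SensorAvailable && pair.Value.LogSensorData)
                point.Readings[pair.Key] = pair.Value.GetSensorData();
        }

        // Appending at the tail of a linked list is O(1).
        _recording.AddLast(point);
    }
}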
The results were awesome. When I ran it full blast I got a very respectable 5000 data points per second... or 5 kHz... on the contest Ultrabook. And that was in debug mode running from Visual Studio. That blew my mind and made me realize I quickly needed to add a timer to slow it down. I ended up implementing a dynamic pseudo-timer which lets the user set up an acquisition period, say 10 points per second, and the app starts by timing how long each batch of sensor readings takes. The difference between the actual time and the ideal time is loaded into the timeout period of a Thread.Join operation which was watching for the termination event anyway. The result was a method that would wait on either the ideal timing window or the thread termination event. It worked like a charm and I could vary the sampling period up to the max as I saw fit.
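A sketch of that pacing loop (hypothetical names; the article describes doing the timed wait with Thread.Join, while this sketch gets the same "wait for the next slot or a stop request" behavior from an event handle's timed wait):

using System;
using System.Diagnostics;
using System.Threading;

class AcquisitionPacer
{
    readonly ManualResetEvent _stopRequested = new ManualResetEvent(false);
    readonly TimeSpan _samplePeriod = TimeSpan.FromMilliseconds(100);   // e.g. 10 points/sec

    public void AcquisitionLoop(Action captureOneBatch)
    {
        var batchTimer = new Stopwatch();
        while (true)
        {
            batchTimer.Restart();
            captureOneBatch();                 // pull a reading from every sensor
            batchTimer.Stop();

            // Wait out whatever is left of the sample period, but wake
            // immediately if a stop has been requested.
            TimeSpan remaining = _samplePeriod - batchTimer.Elapsed;
            if (remaining < TimeSpan.Zero)
                remaining = TimeSpan.Zero;
            if (_stopRequested.WaitOne(remaining))
                break;
        }
    }

    public void Stop()
    {
        _stopRequested.Set();
    }
}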
Once data acquisition was done, the linked list of data was passed to a method which would write the data to disk. In actuality, there were two linked lists. One (which always inserted new nodes at the beginning) was the buffer, and it recorded 500 data points prior to the acceleration sensors detecting motion. When the acceleration threshold was reached, the data acquisition thread would switch to the recording linked list, which added points to the end until the recording stopped. Prior to being written to disk, the buffer linked list was reversed into a final run list and the recording list was appended to the end of that list. The result was a final, single linked list which represented the 500 data points prior to acceleration and then all the data points after acceleration. This was written to disk a record at a time, each record including a timestamp and the individual sensor data. The result was an XML file of the data acquired during the run that would import directly into Excel.
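The merge step might look something like this sketch (names assumed; the per-record XML writing itself is omitted here):

using System.Collections.Generic;

static class RunFileBuilder
{
    public static LinkedList<SensorDataPoint> BuildRunList(
        LinkedList<SensorDataPoint> preTriggerBuffer,   // built by inserting at the head (newest first)
        LinkedList<SensorDataPoint> recorded)           // built by appending at the tail
    {
        var run = new LinkedList<SensorDataPoint>();

        // Prepending each buffered point reverses the buffer back into
        // chronological order (oldest first).
        foreach (SensorDataPoint point in preTriggerBuffer)
            run.AddFirst(point);

        // The recorded points are already in order; append them after the buffer.
        foreach (SensorDataPoint point in recorded)
            run.AddLast(point);

        return run;
    }
}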
OBD II
One of the key features was to be the ability to capture OBD II data at the same time and correlate it with the data coming from the sensors. This is what would make Rev Warrior a really useful tool compared to many other solutions. Traditionally, this has had to be done with an expensive outboard sensor pack because laptops didn't have the sensors built in. That is where I thought the Ultrabook could really make a difference. Being able to have a device with a built-in gyro, accelerometer, GPS, etc., whose data can be correlated with throttle position, speed, timing advance, etc., would make a really powerful (and much less expensive) package. I still believe this is the case, but I was forced to hide OBD II from Intel when I re-submitted.
Why? Because the Intel evaluators apparently don't understand what "optional" means when talking about optional hardware versus required hardware. They failed my app submission because I didn't provide them with the "required" hardware needed to test the application. There was a good reason I didn't: the application was written to gracefully fall back if OBD II connectivity wasn't present. I thought I had done a pretty good job of that. The dials would gray out with a message saying that the hardware wasn't present.
Also, it wasn't practical to send Intel the hardware. First, the time frame of the contest didn't allow it... it took them 8 days to evaluate it the first time anyway... how much longer would that have stretched if I had sent them the hardware? Secondly, the hardware is expensive. The PLX Devices Bluetooth adapter is $100... I couldn't afford to send that to them never to see it again. Besides, were they really going to go plug an Ultrabook into someone's car to see if the adapter really worked? And so what? Isn't it my responsibility to test the app? If the OBD didn't actually work when plugged in, who are they to say? Their submission guidelines don't specify that the app must comply with how I think or how they think it should work!
The fix was embarrassing: I had to hide the OBD support from Intel. I couldn't remove the library because it was built into the app, so I simply hid the dials and the buttons that allowed you to get to the hardware setup screen and told Intel that I had removed support. That was a big kick in the gut because I felt like that would be key in the judging of the application... the synergy of internal and external sensing was really what I was going for to show off what could be done with an Ultrabook. But I felt like getting something in was better than getting nothing in, and I hoped that they would approve the app in time if I just removed the offending functionality.
(Note the missing gauges... they were driven by the OBD II interface)
I fully intend to continue to develop this app and take it commercial. OBD II functionality will be going back in and if Intel wants to force me to send an adapter to them, I'll find a more friendly app store in which to sell my application.
I should also note that I didn't re-invent the OBD II wheel. The fine people over at Channel 9 have re-envisioned a Mustang and used OBD II to create a really cool electronic dashboard for it. Search for 'Project Detroit' to see and admire their work. As part of that project, they wrote a great OBD II library. Really, they made my project possible in the time frame allocated because the actual decoding of OBD II data isn't that simple, and they took a lot of that overhead away.
I did find, however, that their library was specific to Ford OBD II connections, and I suspect it may even be specific to certain models in Ford's line. The Project Detroit team only implemented 2 of the 12 protocols, so when I connected the library to my Chevy Trailblazer and my Buick LeSabre, it fell on its face. On the upside, I'm learning a lot more about OBD II than I knew before I started this, and I think I will be able to patch the library to support all 12 protocols.
WinRT on Windows Professional
So one of the nefarious little things that I had to do in this project was figure out how to consume the WinRT stack in a desktop application. The Windows.Devices.Sensors namespace lives in the WinRT stack, and although the Microsoft documentation says that you can use WinRT (at least a subset of it) in a full desktop app, they don't bother showing you how to actually do it. Google to the rescue, but I must say that this was WAY harder than it should have been. I could have easily seen someone's contest entry getting derailed by this. Locating the proper libraries and the famous (or rather infamous) *.winmd file(s), plus knowing how to unload the project and make it Windows 8 only, proved much more frustrating than it should have been. I'll probably write an article here on how to get it working because there really seems to be a vacuum of knowledge there.
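For anyone fighting the same battle, the recipe as far as I can reconstruct it is roughly: unload the project, add <TargetPlatformVersion>8.0</TargetPlatformVersion> to the .csproj (which makes the "Windows" WinRT reference show up in the Reference Manager and ties the project to Windows 8), reload, and add the reference to Windows.winmd. After that, a sketch of reading the accelerometer from a plain desktop app looks something like this:

using Windows.Devices.Sensors;   // resolved from Windows.winmd once the project is retargeted

class SensorCheck
{
    static void Main()
    {
        // GetDefault() returns null when the hardware (or driver) isn't present,
        // which is how the app can fall back gracefully.
        Accelerometer accelerometer = Accelerometer.GetDefault();
        if (accelerometer != null)
        {
            AccelerometerReading reading = accelerometer.GetCurrentReading();
            if (reading != null)
            {
                System.Console.WriteLine("X axis: {0} g", reading.AccelerationX);
            }
        }
    }
}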
Lessons Learned
So I took a lot away from this competition. More downsides than upsides, which is unfortunate. On the upside, I actually forced myself to write a complete app, from blank solution to runnable app, in about 6 weeks. (Notice I didn't say finished app...) As mentioned, I fully intend to finish this application and try to make it something compelling that people will actually buy. As part of this project, I looked at other solutions, and what I found only encouraged me that there is a place for my application in the market. The challenge will be to keep the intensity up and to carve out time to continue to work on it and refine it. So even though I'm disappointed that I apparently missed judging because of the approval process, and I'm not happy with the current version, I do believe I have something I can build on, and that does make me happy.
Comodo
I learned I was not happy with Comodo. When I started the competition, I thought it would be great to get a signing certificate at no cost. Instead, I sort of felt like I got tricked into a bait and switch. Comodo was not clear about what the requirements were or what the process would be when trying to get the certificate. The only options were a personal certificate and a corporate certificate. The personal certificate would include my address and my phone number... that has to be unacceptable to pretty much everybody. Why would you put that information out on every app that you post? So the only option was the company certificate. The problem here was that despite the fact I had worked as a sole proprietorship for years, that was not good enough for them. It had to be a registered corporation. So I begrudgingly parted with $125 USD to file for an LLC in the state of Ohio, set up a bank account in the company name, and submitted all the paperwork to Comodo. At that point they told me I needed a valid phone number that was listed in an online database. Where was that in the requirements listing? Fortunately, I have an Asterisk-based phone system at home, so a quick PAYGO phone line, some scripts, submission to SuperPages, and in 24 hrs I had a "listed" phone number. Then they needed to call me... I started to wonder if it would ever end. Fortunately, I got the certificate, but I was mad that I was out all that time and money for something so trivial... especially when I started looking at other companies' offerings and found their approval processes much better defined and quite a bit easier.
Intel AppUp
I was really excited about submitting to the Intel AppUp store. I setup my account and dashboard early and had used their resources in working on the app. Their documentation was helpful on the WinRT problem I mentioned earlier as well as several other key areas where I needed to learn things.
However, when it came to the submission process, I wasn't happy. The day after I submitted my app, I realized there was a bug in the MSI and that no shortcuts were being created when the app was installed. Easy fix... but there was no way to update the binary I had uploaded for approval. As soon as I had uploaded it, I was doomed and didn't even realize it. My app was sitting in purgatory until they could actually get to it, but that didn't matter; I couldn't touch it and fix the issue that I was pretty sure would fail it. And I was right... the missing shortcuts were one of the reasons it was failed the first time. I don't understand why I couldn't get to it between when I submitted it and when they actually got to it. Ultimately it would have saved their time (by not reviewing an entry I knew would fail), my time, and my entry in the contest. But I'm not bitter about this...
Of course, as I mentioned before, the hardware was a sore spot as well. When the app launches the first time, it asks the user what type of OBD II hardware they are using. The FIRST SELECTION is NONE. So why does Intel think that the hardware is required for the operation of the program? Clearly they have not thought this one out... if the hardware was REQUIRED to run the app, then I could see the hardware being in the tester's hands a critical requirement. But when the app runs just fine without the hardware, why should they fail the app if they don't have the hardware? It's really my problem if the app doesn't run when the hardware is connected. Either they are approving these apps for sale or they are debugging and testing them... in which case they should be up-front and say that is what they are doing.
Intel and I will be going round 2 on this... once I get past this second submission, I'm adding the hardware support back into the app. As I mentioned, I'll go somewhere else if they want to hassle me about optional hardware support. (Note that on the submission forms, you are asked about mandatory hardware... an OBD II adapter is not mandatory for my app to run... so I didn't list it.)
The other thing that really annoyed me was the AppUp SDK. To integrate it into your app, you include the libraries and add some code. However, as soon as you do that, you have to have a listener proxy that will 'fake' the validation of your app. Of course, when you are debugging, your app ID has to be passed as a gazillion 1's, but when you are building for release it has to be your actual app GUID. So there is the rub... the real GUID doesn't work until some mystical deity at Intel says that it works. So could I actually install and test my app as submitted to Intel? Nope. Because my real GUID wasn't "live" according to the listener proxy. That is just plain silly... I get the GUID when I create my app in the dashboard... why can't the GUID work at that point? Why does the app have to be published for the GUID to work? It basically forces me to submit the app blind... not sure if the SDK integration is actually working. So if I add the AppUp SDK to my app for integration with the store, I have to use the dummy ID, then remember to change it before I submit (don't want to waste a week because you forgot something silly like that), and just trust that the libraries will work properly and not blow my app up when it launches with the real ID. Again, this just reeks of not being thought out well.
Overall I was happy that Intel sponsored this contest and I got an Ultrabook so it is hard to knock those guys. But at the same time, when you get knocked out of the running because their process isn't adaptable, or because they misinterpret a feature, that is a hard pill to swallow.
The Contest In General
I love that Code Project finds sponsors and offers these contests. Sometimes some really great stuff gets pulled out of the woodwork because there is a motivator (cash and prizes are a great motivator) and I think that makes them worthwhile.
What bothers me is that I see in this contest the same thing that frustrated me when I participated in the Nokia Dev Challenge for Windows Phone. In both cases, the timeline for working on the app was seriously compressed. (For the Nokia event it was only about 9 hrs.) The contest doesn't stipulate whether the work must be an original, unpublished submission or whether existing apps can be submitted. So what has become clear to me is that the people who have an existing app and simply port it or adapt it to meet the contest parameters have a clearly unfair advantage. (I watched an app win the Columbus, OH Nokia challenge that the guy admitted in the presentation was a product of his company and that they had been working on for months. He showed up there and did some coding and testing on a Nokia phone so he qualified for the prizes. Meanwhile, I sat there embarrassed with a half-finished app I had slaved over for 9 hrs straight.) These people could have months or years behind their app refining it and see an opportunity to throw it into a competition and gather up some gold. Meanwhile, those of us who are (in my opinion) adhering to the spirit of the competition end up scrambling, cutting features, and skipping key steps like testing to make the deadline and get something in, because something is better than nothing. In reality, we stand absolutely no chance against the polished apps because our stuff looks like some fly-by-night submission compared to their highly-developed and refined applications. So just like at the end of the Nokia challenge, I'm left holding onto a half-assembled embarrassment while the people who had a 3-year head start on me collect the prizes. I will likely never again participate in a competition like this because I feel like I'm playing by the rules and starting at the start line while another group of people are starting 3 feet from the finish line... so there is no possible way for me to beat them.
The other thing that became clear in this process is that the cards are stacked against the individual/independent developer... And I think it has actually gotten worse, not better, in the last year. The focus on app stores and mobile devices and all is great and I think it is awesome that there are lots of tools to get you going in those areas. But desktop development has taken a back seat to these other areas and it makes it tough on a single developer, like me, to be able to create a successful application. Case in point:
- Visual Studio comes in an Express version. Awesome... the tools are there for free... but wait, it isn't hard to run into the limitations of the free version. As an MSDN subscriber, I have VS Professional... but what if I didn't? What if I didn't get that subscription as a benefit of my regular job? I understand MS wants to differentiate their products and drive different revenue streams through different features, but the various versions only serve to divide the development community into levels of haves and have-nots.
- The removal of setup projects was the most stunning slap in the face MS could have delivered in VS 2012. At the time I thought, 'oh... no big deal... I don't use them much.' But that is because I primarily work in Silverlight in my regular job. When it came time to package my application for submission, I found that the only viable option for a simple deployment is InstallShield Limited Edition (or, as I think of it, the Odd-Lots Edition)... which you have to sign up to get. It is an extremely crippled version of the software, lacking basic abilities like specifying the install type (minimum, standard, custom), adjusting the script, or even specifying which types of installers will be created. And if you want any of the features that were crippled out, the better versions of the software don't come cheap. In the US, the Express version is $650... way out of the budget of anyone writing software on their own. And it gets worse... want to keep it up to date? Well, that will be a $1500 maintenance plan. So what do people do? Create the solution in Visual Studio 2010, add the setup project, then open it and work in VS 2012, going back to 2010 when you have to package a deployment. Just silly...
- Protecting your code is the other farce in the market. PreEmptive will give you an evaluation version of their obfuscator, but it is limited in what it will do. Is some protection better than none? Probably... but even after using the eval version to scramble my code, I still feel like it is going to come back in a month and a half to tell me it is pregnant. So how much does real code protection cost? Well, normally it will cost you a Ulysses S. Grant less than $5,000, but if you act now, through their special marketing offer for single-developer operations, you can have it for only $1,250... but that is a limited-time offer. Yeah... thanks but no thanks... I still need a house and food for my kids. There are other options, like Eazfuscator.NET, but it went commercial this past summer, and at $400 it is pretty steep for a single person as well. There are some open source projects, but nothing is up to snuff yet. So my app will go into the app store with little protection because I'm not rich enough to shell out for real code obfuscation.
I understand the need for all of these companies to make money on their products. I, too, am a fan of making money on my efforts. But there is a lack of attention paid to the single/small indie developer. These tools are available to big companies that can afford the costs, but the rest of us, sort of in the trenches trying to innovate, are left with hat in hand. That is a shame. And all these tools companies talk about their commitment to the development community, but that commitment comes with a price... and that price starts several floors above where someone like me can reach.
Ultimate conclusion
So am I disappointed? Hell yeah. I poured 6 weeks of my life into this competition only to get denied entry at the door because the bouncer doesn't like the color of my socks. That sucks and there is no way around it. I will be bitter about it for some time to come. The only upside is that I got an Ultrabook (which, by the way, is great) and I have an app I can continue to build on and maybe "win" in the open marketplace. I am, however, now completely turned off by coding challenges and think it will be a cold day in hell before I enter another one.
Feel free to post your comments and experiences to this article. Did you have a similar experience? I'd be interested in knowing.
History
Oct 17, 2012 - Initial revision
Nov 5, 2012 - Added update on Google Drive and linked to the article I wrote on using it from WPF.
Dec 3, 2012 - Added final project notes and rants.