My apologies for not posting for a while; it’s been a pretty crazy couple of months and it’s just about to get a whole lot crazier. For those who aren’t aware, Intel® has started running coder challenges where they bring together people who are incredibly talented and very, very certifiable and issue them with a challenge. Last year, they ran something called the Ultimate Coder, which looked to find, well, the ultimate coder for creating showcase Ultrabook™ applications. This competition proved so successful, and sparked such interest from developers, that Intel® is doing it again, only crazier.
So, Ultimate Coder 2 is about to kick off, and like The Wrath of Khan, it proves that sequels can be even better than the original. The challenge this time is to create applications that make use of the next generation of Ultrabook™ features to cope with going “full tablet”, and, as if that wasn’t enough, the contestants are being challenged to create perceptual applications. Right now, I bet two questions are going through your mind: first of all, why are you writing about this, Pete, and secondly, what’s perceptual computing?
The answer to the first question lies in the fact that Intel® has very kindly agreed to accept me as a charity case developer in the second Ultimate Coder challenge (see, I can do humble – most of the time, I just choose not to). The second part is a whole lot more fun – suppose you want to create applications that respond to touch, gestures, voice, waving your hands in the air, moving your hands in and out to signify zoom in and out, or basically just about the wildest UI fantasies you’ve seen coming out of Hollywood over the last 30 years – that’s perceptual computing.
So, we’ve got Lenovo Yoga 13 Ultrabooks™ to develop the apps on, and we’ve got the perceptual camera and SDK to show off. We’ve also got just seven weeks to create our applications.
It wouldn’t be a Pete post without some source code though, so here’s a little taster of how to write voice recognition code in C# with the SDK.
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// A simple pipeline that listens for a fixed set of voice commands and raises
// an event whenever one of them is recognized.
public class VoicePipeline : UtilMPipeline
{
    // The commands to listen for; the recognition label is an index into this list.
    private List<string> cmds = new List<string>();

    // Raised when one of the commands is recognized.
    public event EventHandler<VoiceEventArgs> VoiceRecognized;

    public VoicePipeline() : base()
    {
        EnableVoiceRecognition();
        cmds.Add("Filter");
        cmds.Add("Save");
        cmds.Add("Load");
        SetVoiceCommands(cmds.ToArray());
    }

    public override void OnRecognized(ref PXCMVoiceRecognition.Recognition data)
    {
        // Copy the delegate locally so it can't become null between the check and the call.
        var handler = VoiceRecognized;
        if (data.label >= 0 && handler != null)
        {
            handler(this, new VoiceEventArgs(cmds[data.label]));
        }
        base.OnRecognized(ref data);
    }

    public async void Run()
    {
        // LoopFrames blocks while it pumps frames, so push it onto a background task.
        await Task.Run(() => { this.LoopFrames(); });
        this.Dispose();
    }
}
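One thing to note: VoiceEventArgs isn’t an SDK type, it’s just a little class to carry the recognized phrase back to whoever is listening. A minimal sketch of it, and of how you might wire the pipeline up, looks something like this (the StartListening method is purely illustrative, standing in for wherever you kick things off in your own app):

using System;

// Minimal event args carrying the recognized command text (not part of the SDK).
public class VoiceEventArgs : EventArgs
{
    public VoiceEventArgs(string command)
    {
        Command = command;
    }

    public string Command { get; private set; }
}

// Hypothetical startup code: subscribe to the event and start the pipeline.
public void StartListening()
{
    var pipeline = new VoicePipeline();
    pipeline.VoiceRecognized += (sender, e) =>
    {
        // React to the command here - for now, just write it out.
        Console.WriteLine("Heard: " + e.Command);
    };
    pipeline.Run();
}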
As the contest progresses, I’ll be posting updates here on my blog, as well as a weekly status report on my application over on the Intel® site. It’s going to be one wild ride.