Altering my perception

My apologies for not posting for a while; it’s been a pretty crazy couple of months, and it’s just about to get a whole lot crazier. For those who aren’t aware, Intel® have started running coder challenges where they get together people who are incredibly talented and very, very certifiable, and issue them with a challenge. Last year they ran something called the Ultimate Coder, which looked to find, well, the ultimate coder for creating showcase Ultrabook™ applications. That competition proved so successful, and sparked so much interest from developers, that Intel® are doing it again, only crazier.

So, Ultimate Coder 2 is about to kick off, and like The Wrath of Khan, it proves that sequels can be even better than the original. The challenge this time is to create applications that make use of the next generation of Ultrabook™ features to cope with going “full tablet”, and as if that wasn’t enough, the contestants are being challenged to create perceptual applications. Right now I bet two questions are going through your mind: first, why are you writing about this, Pete, and second, what’s perceptual computing?

The answer to the first question lies in the fact that Intel® have very kindly agreed to accept me as a charity case developer in the second Ultimate Coder challenge (see, I can do humble – most of the time I just choose not to). The second part is a whole lot more fun – suppose you want to create applications that respond to touch, gestures, voice, waving your hands in the air, moving your hands in and out to signify zoom in and out, or basically just about the wildest UI fantasies you’ve seen coming out of Hollywood over the last 30 years – that’s perceptual computing.

So, we’ve got Lenovo Yoga 13 Ultrabooks™ to develop the apps on, and we’ve got the Perceptual Computing camera and SDK to show off. We’ve also got 7 weeks to create our applications, so it’s going to be one wild ride.

It wouldn’t be a Pete post without some source code though, so here’s a little taster of how to write voice recognition code in C# with the SDK.

public class VoicePipeline : UtilMPipeline
{
    private List<string> cmds = new List<string>();

    public event EventHandler<VoiceEventArgs> VoiceRecognized;

    public VoicePipeline() : base()
    {
    }

    public override void OnRecognized(ref PXCMVoiceRecognition.Recognition data)
    {
        // Copy the event to a local so a subscriber can't detach
        // between the null check and the invocation.
        var handler = VoiceRecognized;
        if (data.label >= 0 && handler != null)
        {
            handler(this, new VoiceEventArgs(cmds[data.label]));
        }
        base.OnRecognized(ref data);
    }

    public async void Run()
    {
        // Pump the camera frames on a background thread.
        await Task.Run(() => this.LoopFrames());
    }
}
As the contest progresses, I’ll be posting both here on my blog and in a weekly status report on my application over on the Intel® site.


7 thoughts on “Altering my perception”

  1. Nicholas

    Sounds like a lot of fun 🙂

    Can you share something about what your application is going to do and which elements of perceptual computing you’re using for it? 🙂

    I wish you good fortune for the contest. I hope no nasty problems block your road 😛


    1. peteohanlon

      Thanks Nicholas. There’ll be a blog post on the Intel site on Monday giving more details about the application. In brief, though, it’s a photo editing application called Huda. I know – big yawn, it’s been done a thousand times before. Huda is different though – most photo editing applications perform destructive edits; in other words, once you’ve applied a filter and saved it, the original image is lost.

      What I’m proposing with Huda is that all edits can be modified, moved around, removed, copied, and so on – and the original image is still preserved. As this is a Perceptual Ultrabook app, there’s going to be a ton of touch based activities – combining touch with voice commands should make for a great experience; one of the early features I want to put in place is touching the screen at two points and telling Huda to crop the image. I’ll be using gestures such as swipe to scroll through images, reach in to zoom in and pull back to zoom the image out. There’s a whole lot more I want to put in, and more details will emerge as I go through the competition.
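      The non-destructive model described in the reply above can be sketched as a list of edits kept apart from the untouched source image. This is only an illustration of the idea, not anything from Huda itself: the names (PhotoEdit, EditPipeline) are mine, and an image is simplified to an array of ints.

      ```csharp
      using System;
      using System.Collections.Generic;
      using System.Linq;

      // Each edit is a pure function over the pixels, so edits can be
      // added, removed, or reordered and then replayed at will.
      public delegate int[] PhotoEdit(int[] pixels);

      public class EditPipeline
      {
          private readonly int[] original;
          private readonly List<PhotoEdit> edits = new List<PhotoEdit>();

          public EditPipeline(int[] pixels)
          {
              // Keep a private copy; no edit ever touches it.
              original = (int[])pixels.Clone();
          }

          public void Add(PhotoEdit edit) { edits.Add(edit); }

          public void RemoveAt(int index) { edits.RemoveAt(index); }

          // Replay every edit against a fresh copy of the original,
          // so rendering never destroys the source data.
          public int[] Render()
          {
              return edits.Aggregate((int[])original.Clone(),
                                     (pixels, edit) => edit(pixels));
          }
      }
      ```

      Removing an edit and calling Render() again simply replays the shorter list against the pristine original, which is exactly the “change your mind about a crop later” workflow described above.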

  2. db7uk

    So Pete, glad you are well. Really pleased that you are entering into the fun! I had tinkered with the Kinect, and that was fun, but being a LOB developer I found it hard to come up with ideas it could be used for. Well, apart from creating a menu system similar to Minority Report, or making CEOs have fun with presentations, etc…

    I am keen to follow how you get on. Do I take it this is geared around WinRT and XAML? (Please say yes to XAML… I’ve had enough of HTML.)

    1. peteohanlon

      The first post next week details more about what I’m planning, but yes, it’s XAML. It’s WPF though, not WinRT, as the Perceptual stuff is so radical that it isn’t really supported by WinRT.

  3. Simon Whale

    Please post a link to your blog on the Intel site; the application does sound exciting and, like db7uk, I am keen to see how you progress.

    I am also glad that it is being done in something other than WinRT (even though I do like Windows 8); I’m just getting bored with the hype around the WinRT technology stack.

    1. peteohanlon

      Simon, I’ll be reposting the Intel blog here on my site as well. I know what you mean about the WinRT stack. While I like some of what it has in there, currently it has far too many limitations for me to want to heavily target it. There are too many things that I take for granted in WPF that just aren’t there yet.
