This week I got the Intel® RealSense™ Camera and started building the Visual Studio extension for the Intel® RealSense™ App Challenge.
First, I installed the Intel® RealSense™ SDK, which ships with some very nice samples for both C++ and C#.
Then I installed Visual Studio 2013 Community edition (thank you, Microsoft!), followed by the Visual Studio 2013 SDK.
All tools are now in place!
The Natural Interaction Visual Studio (2013) extension is designed to initialize when a solution is loaded: this is when the devices are discovered and the entities (e.g. speech recognition) are set up. Once that happens, the user can go to the Tools menu and then Options... to view the Natural Interaction extension settings. In my case, I used a small vocabulary file containing the sentences "Start debugging" and "Stop debugging".
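To give an idea of the wiring, here is a minimal sketch of how a recognized sentence could be routed to the corresponding debugger command. The `DTE.ExecuteCommand` calls and the command names `Debug.Start` / `Debug.StopDebugging` are the standard Visual Studio automation API; the `VoiceCommandRouter` class and its `OnSentenceRecognized` entry point are hypothetical names I'm using for illustration, since the actual speech callback shape depends on the RealSense SDK module.

```csharp
using System.Collections.Generic;
using EnvDTE;

// Hypothetical sketch: routes sentences from the vocabulary file to
// Visual Studio commands through the DTE automation object.
public class VoiceCommandRouter
{
    private readonly DTE _dte;

    // Spoken sentence -> Visual Studio command name.
    private readonly Dictionary<string, string> _commands =
        new Dictionary<string, string>
        {
            { "Start debugging", "Debug.Start" },
            { "Stop debugging",  "Debug.StopDebugging" }
        };

    public VoiceCommandRouter(DTE dte)
    {
        _dte = dte;
    }

    // Assumed entry point, invoked by the speech recognition module when a
    // sentence from the vocabulary is recognized; the real SDK callback
    // signature will differ.
    public void OnSentenceRecognized(string sentence)
    {
        string commandName;
        if (_commands.TryGetValue(sentence, out commandName))
        {
            _dte.ExecuteCommand(commandName);
        }
    }
}
```

Keeping the sentence-to-command map in a dictionary makes it easy to grow the vocabulary later without touching the recognition code.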
After a few hours of coding, I arrived at the following (promising) result: http://screencast.com/t/uvvmbfBHXkrQ.
This is Demo 1; more commands (like Step Into and Step Out) will follow. Gestures are also to be expected in the near future... but that is for another post!