Oculus Avatar SDK with C++

In my post Oculus Touch controllers with C++, I programmed these elegant Oculus Touch controllers with some C++ code:

[Image: the Oculus Rift Touch controllers]

When the left Touch controller is held at a certain height and its hand trigger is pressed, a cone appears in our virtual world:

[Image: a cone rendered in the virtual world]

The controller will also vibrate, providing haptic feedback.
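For reference, that logic boils down to a few OVR calls. Here is a minimal sketch, assuming an ovrSession named session created at startup; ShowCone() is a hypothetical stand-in for whatever draws the cone, and the height threshold is arbitrary:

```cpp
#include <OVR_CAPI.h>

void ShowCone();  // hypothetical: renders the cone into the scene

// Sketch: spawn the cone when the left hand trigger is squeezed while the
// controller is held above a chosen height, then buzz the controller.
void UpdateLeftHandCone(ovrSession session, double displayTime)
{
    ovrInputState inputState;
    if (OVR_SUCCESS(ovr_GetInputState(session, ovrControllerType_Touch, &inputState)))
    {
        ovrTrackingState ts = ovr_GetTrackingState(session, displayTime, ovrTrue);
        float handHeight = ts.HandPoses[ovrHand_Left].ThePose.Position.y;

        // HandTrigger reports an analogue value in [0, 1]
        bool squeezed = inputState.HandTrigger[ovrHand_Left] > 0.5f;
        bool heldHigh = handHeight > 0.2f;  // arbitrary threshold for this sketch

        if (squeezed && heldHigh)
        {
            ShowCone();
            // Haptic feedback on the left controller
            ovr_SetControllerVibration(session, ovrControllerType_LTouch, 1.0f, 1.0f);
        }
        else
        {
            ovr_SetControllerVibration(session, ovrControllerType_LTouch, 0.0f, 0.0f);
        }
    }
}
```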

If that was not enough, I also coded the thumbsticks on the Touch controllers so I could stroll about the virtual world.
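That strolling is just the thumbstick axes nudged into a position vector each frame. A rough sketch, assuming a playerPos that the view matrices are built from; head yaw is ignored for brevity:

```cpp
#include <OVR_CAPI.h>

// Sketch: stroll about by nudging the player position with the left
// thumbstick each frame.
void UpdateLocomotion(ovrSession session, ovrVector3f& playerPos, float deltaSeconds)
{
    ovrInputState inputState;
    if (OVR_SUCCESS(ovr_GetInputState(session, ovrControllerType_Touch, &inputState)))
    {
        const float speed = 2.0f;  // metres per second
        const ovrVector2f stick = inputState.Thumbstick[ovrHand_Left];

        playerPos.x += stick.x * speed * deltaSeconds;  // strafe
        playerPos.z -= stick.y * speed * deltaSeconds;  // forward is -Z in OVR's space
    }
}
```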

But something is missing. I can’t quite put my finger on it (sorry about that).

Ah, of course! What we really need is to see a pair of virtual hands in our pixel world. Yes, virtual hands that will mimic the exact position and movement of our real hands and fingers.

As it happens, the Oculus Avatar SDK is now available for download, along with documentation.

One of the things that the Avatar SDK can help us with is putting a pair of virtual hands into our VR apps.

Sample code for rendering virtual hands is provided in the Avatar SDK ‘Mirror’ Visual Studio solution. Note that the generate_projects.cmd file will spit out a Visual Studio 2013 solution; I edited the file in Notepad so it spits out a Visual Studio 2015 solution instead. Another thing to note: the libovravatar.lib file is 64-bit. There is a query on the Oculus forums as to when a 32-bit lib will be available, but for now I rebuilt my virtual world as a 64-bit project so I could integrate the lib.
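For a flavour of what the Mirror sample does, the bootstrap is roughly: initialise the SDK with your app ID, request an avatar specification for a user, then service the message queue each frame until the avatar and its assets have loaded. A trimmed sketch, with HandleSpec and HandleAssetLoaded as hypothetical stand-ins for the sample's handlers:

```cpp
#include <OVR_Avatar.h>
#include <cstdint>

// Hypothetical handlers standing in for the Mirror sample's versions:
// HandleSpec() would call ovrAvatar_Create with ovrAvatarCapability_Hands
// and request the assets the spec lists; HandleAssetLoaded() would build
// the GPU resources for each loaded asset.
void HandleSpec(const ovrAvatarMessage_AvatarSpecification* msg);
void HandleAssetLoaded(const ovrAvatarMessage_AssetLoaded* msg);

void InitAvatar(const char* appID, uint64_t userID)
{
    ovrAvatar_Initialize(appID);
    ovrAvatar_RequestAvatarSpecification(userID);
}

// Call once per frame to drain the Avatar SDK's message queue.
void PumpAvatarMessages()
{
    while (ovrAvatarMessage* message = ovrAvatarMessage_Pop())
    {
        switch (ovrAvatarMessage_GetType(message))
        {
        case ovrAvatarMessageType_AvatarSpecification:
            HandleSpec(ovrAvatarMessage_GetAvatarSpecification(message));
            break;
        case ovrAvatarMessageType_AssetLoaded:
            HandleAssetLoaded(ovrAvatarMessage_GetAssetLoaded(message));
            break;
        default:
            break;
        }
        ovrAvatarMessage_Free(message);
    }
}
```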

Okay, my virtual world now has the code to render virtual hands. Let’s put my grubby mitts to the test:

A webcam is capturing images of me in the real world brandishing the Touch controllers, and those images are being rendered onto the face of a virtual cube. Can you see me? Not only that, but goddamn it, a pair of virtual hands are also being rendered into the virtual world!
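(The webcam trick, by the way, is just each captured frame uploaded as an OpenGL texture on the cube face. A rough sketch with OpenCV, assuming GLEW and a texture created earlier with glTexImage2D at the webcam's resolution:)

```cpp
#include <GL/glew.h>
#include <opencv2/opencv.hpp>

// Sketch: grab a webcam frame with OpenCV and upload it into the texture
// mapped onto one face of the cube.
void UpdateWebcamTexture(cv::VideoCapture& webcam, GLuint cubeTexture)
{
    cv::Mat frame;
    if (webcam.read(frame))
    {
        glBindTexture(GL_TEXTURE_2D, cubeTexture);
        glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  // rows are tightly packed
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0,
                        frame.cols, frame.rows,
                        GL_BGR, GL_UNSIGNED_BYTE, frame.data);  // OpenCV frames are BGR
    }
}
```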

When I move my hands close to my face in the real world, those virtual hands lift up to my eyes as I peer into the virtual reality headset.

When I raise my real thumbs, my virtual hands give the thumbs up.

My virtual hands really feel like part of my body, positioned just right in front of me. I want to reach out and grab something virtual!

I point with my index fingers in the real world and the Touch controllers are smart enough to do the same with my virtual hands. That’s because the controllers can sense not only when I am pressing a trigger but also when I am gently resting my finger on a trigger. Or not.
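Those resting-finger readings surface in the SDK as gesture flags on the input state's Touches bitmask. A small sketch for the left hand:

```cpp
#include <OVR_CAPI.h>

struct HandGestures { bool pointing; bool thumbUp; };

// Sketch: the capacitive sensors report gesture flags alongside touches,
// so pointing and thumbs-up can be read straight off the Touches bitmask.
HandGestures ReadLeftGestures(ovrSession session)
{
    HandGestures g = { false, false };
    ovrInputState inputState;
    if (OVR_SUCCESS(ovr_GetInputState(session, ovrControllerType_Touch, &inputState)))
    {
        g.pointing = (inputState.Touches & ovrTouch_LIndexPointing) != 0;
        g.thumbUp  = (inputState.Touches & ovrTouch_LThumbUp) != 0;
    }
    return g;
}
```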

I have programmed the thumbsticks on the Touch controllers so I can swan about my virtual world. Watch me go!

If I squeeze the right Touch controller, my right virtual hand squeezes too. But when I do the same with my left hand, a cone appears in the virtual world. That is because I have coded the left Touch controller to show a virtual cone when its hand trigger is pressed (and the controller is held at a certain height). The controller vibrates as well, since it has been programmed to provide haptic feedback.
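That hand squeezing comes from feeding the controller state into the avatar every frame through ovrAvatarHandInputState, the way the Mirror sample does. A trimmed sketch; the body pose update and the button/touch masks are omitted for brevity:

```cpp
#include <OVR_CAPI.h>
#include <OVR_Avatar.h>

// Convert an OVR pose into the Avatar SDK's transform type.
static ovrAvatarTransform AvatarTransformFromPose(const ovrPosef& pose)
{
    ovrAvatarTransform t;
    t.position    = { pose.Position.x, pose.Position.y, pose.Position.z };
    t.orientation = { pose.Orientation.x, pose.Orientation.y,
                      pose.Orientation.z, pose.Orientation.w };
    t.scale       = { 1.0f, 1.0f, 1.0f };
    return t;
}

// Sketch: feed the Touch state into the avatar each frame so the virtual
// hands squeeze, point and move with the real ones.
void UpdateAvatarHands(ovrAvatar* avatar, ovrSession session,
                       const ovrTrackingState& trackingState, float deltaSeconds)
{
    ovrInputState touch;
    ovr_GetInputState(session, ovrControllerType_Touch, &touch);

    ovrAvatarHandInputState hands[2] = {};
    for (int hand = 0; hand < 2; ++hand)
    {
        hands[hand].transform    = AvatarTransformFromPose(trackingState.HandPoses[hand].ThePose);
        hands[hand].joystickX    = touch.Thumbstick[hand].x;
        hands[hand].joystickY    = touch.Thumbstick[hand].y;
        hands[hand].indexTrigger = touch.IndexTrigger[hand];
        hands[hand].handTrigger  = touch.HandTrigger[hand];  // drives the squeeze
        // Hands vanish when their controller is set down
        hands[hand].isActive     = (touch.ControllerType &
            (hand == ovrHand_Left ? ovrControllerType_LTouch : ovrControllerType_RTouch)) != 0;
    }

    ovrAvatarPose_UpdateHands(avatar, hands[ovrHand_Left], hands[ovrHand_Right]);
    ovrAvatarPose_Finalize(avatar, deltaSeconds);
}
```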

I put the Touch controllers down on my desk at the end of the demo and they disappear from my virtual view. As if severed at the wrist.

So there you have it: the perfect pair of virtual hands for our virtual world, perfectly mimicking our real hands. Now we can start coding those hands to make rude gestures, pick up and throw bombs, punch stupid fat faces, strangle necks, steal a wallet from a banker and perform lots of other productive acts.

Can’t wait to get started!

Ciao

https://myspace.com/rdmilligan/music/album/bones-ep-19172460