SteamVR Tracking License Day 2

Full Day

Alright!  Today was our first full-on training day, and we did nothing but learn from 9:00 to 5:00; it was so amazing!  We're all very much friends in the class at this point, cracking jokes and sharing ideas.  Synapse does a great job of creating a fantastic educational environment; I'd be eager to return here in the future if I ever have the opportunity.

Highlights of the second day: we learned how to simulate and optimize sensor placement, and we're all getting pretty good at knowing what to look out for when designing tracked objects so they track as well as possible.  Designing for lighthouse is very much a skill that comes with practice, and over time you develop an intuition for what will work best.

Simulation

So before diving into any physical hardware, we started with a magnificent visualization tool Valve has created called HMD Designer.  It's got a few different options, but ultimately you feed it a 3D .STL model and it spits out optimized sensor positions and angles for the best quality of tracking.  You can also feed it a .JSON file with the sensor positions already defined, but we'll get into that later.

A key feature of HMD Designer is that it outputs multiple visualizations, primarily used to see how the lighthouse protocols interact with your object and its current sensor positions.  FYI, dense lighthouse science ahead.

HMD Designer Graphs

In this graphic, a 2D representation of how simulated tracking performs with your specific object, we can see where any potential shortcomings of the current design may lie.  The bluer, the better!  Also, regarding the format of these graphs: we are looking at unwrapped 3D visualizations (we'll see the wrapped versions later).  The left and right outer edges are -Z, the centre is +Z, the top is +Y, the bottom is -Y, centre-left is -X, and centre-right is +X.  If that doesn't make sense, bear with me until the end of this section.
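
To make that layout concrete, here's a tiny sketch of how I read the projection (this is my own reconstruction of the mapping from the plots, not Valve's documented math): the horizontal axis is the angle around Y, and the vertical axis is elevation.

```python
import math

def unwrap(x, y, z):
    """Map a unit view direction onto the unwrapped graph layout:
    u runs -1..1 left to right (+Z at 0, +X at 0.5, -Z at the edges),
    v runs -1..1 bottom to top (-Y to +Y)."""
    u = math.atan2(x, z) / math.pi
    v = math.asin(max(-1.0, min(1.0, y))) / (math.pi / 2)
    return u, v

print(unwrap(0, 0, 1))   # (0.0, 0.0) -> dead centre (+Z)
print(unwrap(1, 0, 0))   # (0.5, 0.0) -> centre-right (+X)
print(unwrap(0, 1, 0))   # (0.0, 1.0) -> top edge (+Y)
```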

In the first graph, we see how many sensors can see the lighthouse from any given viewpoint around the object.  The more sensors visible, the more datapoints we get!  This is great because with more datapoints comes redundancy, and that really helps lock down a very definite pose.

In the second graph, we've got a heatmap of potential rotation error.  This error arises when all visible sensors lie on (or very near) a single plane (discussed briefly in Day 1); in that case, SteamVR has a very tough time resolving minute rotations about axes facing the lighthouse, since such rotations make very little difference to the times at which each sensor is hit by the lighthouse scan.  Solving this problem means adding or moving sensors so at least one sits out of the plane formed by the other visible sensors.

For the third visualization, the simulation tells us whether acquiring an initial pose is possible from various positions.  An initial pose is acquired entirely through the IR sensors: you can't use an IMU for tracking if you don't know where the object was previously.  Because of this, you need at least four sensors visible from a position, with one removed from the plane of the other three (again, as you'll notice, this fact is very important and is more or less the core of sensor placement design).

And finally, pose translation error arises from sensors being strictly colinear.  First of all, if your sensors are colinear, you're going to run into more problems than just pose translation error, namely rotation error: if the sensors don't even form a plane, you certainly won't have any sensors removed from one.  Either of those errors results in a failure to capture an initial pose.  Secondly, if your sensors are colinear and are struck by the laser at specific times, SteamVR has no way of recovering the rotation and position from those colinear sensors alone; there are a lot of possible poses in which the problem sensors would be struck at exactly those times!  The fix is simply to pull a sensor out of line (and hopefully out of plane, so as to avoid rotation errors).
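
None of this is Valve's code, of course, but the geometry behind these failure modes is easy to sketch.  Here's a minimal check (assuming numpy) that classifies a set of visible sensor positions by the rank of the centred point cloud: rank 1 means collinear, rank 2 means coplanar, and rank 3 means at least one sensor is usefully out of plane.

```python
import numpy as np

def sensor_degeneracy(points, tol=1e-4):
    """Return 'collinear', 'coplanar', or 'ok' for a list of visible
    sensor positions (x, y, z) in metres."""
    pts = np.asarray(points, dtype=float)
    if len(pts) < 2:
        return 'collinear'
    centred = pts - pts.mean(axis=0)
    # Singular values measure the point cloud's spread along each axis.
    s = np.linalg.svd(centred, compute_uv=False)
    rank = int(np.sum(s > tol * max(s[0], tol)))
    if rank <= 1:
        return 'collinear'  # no plane at all: translation AND rotation ambiguity
    if rank == 2:
        return 'coplanar'   # a plane, but nothing out of it: rotation ambiguity
    return 'ok'             # at least one sensor sits off the common plane

flat = [(0, 0, 0), (0.03, 0, 0), (0, 0.03, 0), (0.03, 0.03, 0)]     # all in z = 0
good = [(0, 0, 0), (0.03, 0, 0), (0, 0.03, 0), (0.01, 0.01, 0.02)]  # one pulled out
print(sensor_degeneracy(flat))  # -> 'coplanar'
print(sensor_degeneracy(good))  # -> 'ok'
```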

3D Viz / 3D Viz Rotation 1 / 3D Viz Rotation 2 / 3D Viz Rotation 3

These images show Valve's 3D visualization software, which wraps the 2D graphs shown above onto a visualization sphere surrounding the object being simulated (in this example, the object is a sort of bevelled block).  In the second 3D visualization, I have enabled rendering of the pose rotation error, which means that from the user's point of view (which is the point of view of the lighthouse), pitch and yaw have the potential to be uncertain in areas with more red.  It comes as no surprise that the rear of this object completely fails to capture or estimate rotation, seeing as there are no sensors on that face!  If you look at the perspective where that yellow cusp is within the reticle, in the third 3D viz capture, you can see that most sensors in view lie on a plane normal to the perspective.  If it weren't for that fourth sensor inside the cusp, that area would be as red as the underside of the object!

JSON

So all this information is great and valuable, but a bunch of colourful graphics doesn't mean much to a computer.  That's where the JSON comes in.  The JSON file is host to all the important information unique to the device: sensor positions, IDs, normals, IMU placement, render model, and a few other less important identifiers are all included in this magical file.  The JSON is stored on the device and presented to SteamVR when the device is first plugged in.  Using the information contained within the JSON, SteamVR knows how to interpret all incoming signal data from the device, and that means we've got sub-millimeter tracking in VR! (cough after calibration cough)
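
To give a flavour of what's inside, here's a stripped-down sketch (from memory; treat the field names as illustrative rather than Valve's exact schema):

```json
{
  "device_class": "controller",
  "render_model": "my_tracked_object",
  "imu": { "position": [0.0, 0.0, 0.0] },
  "lighthouse_config": {
    "modelPoints":  [[ 0.020, 0.010, 0.000],
                     [-0.020, 0.010, 0.000]],
    "modelNormals": [[ 0.000, 0.000, 1.000],
                     [ 0.000, 0.000, 1.000]]
  }
}
```

Conceptually, it's one position and one normal per sensor, in matching order (the real file has more plumbing than this, of course).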

Optics

After all that high-level optimization and reporting, we got a brief rundown of optics and the challenges that come with protecting infra-red sensors for consumer use.  Valve and Synapse have conducted a number of scientific analyses of how IR light interacts with different types of materials in different situations, and how those interactions affect sensor accuracy when receiving lighthouse scans.  Comparing elements such as transparent vs. diffuse materials, sensor distance from cover, apertures, and chamfers, Valve and Synapse have concluded that a thin cover material that is opaque to visible light but diffuse to IR is best for lighthouse reception.  Additionally, an aperture is added around the diffuse material so that the light doesn't activate the sensor before it should actually be "hit."

Hardware!

Then came the boxes.  The glorious, unmarked cardboard boxes!  We got a lot of goodies inside (I’ll be sharing my development with them over the next few weeks), but the most impressive tool is pictured below (thanks to Michael McClenaghan for the photography):

Watchman

This is the Watchman board!  On this little thing – only about a square inch – is just about everything needed for a fully functioning SteamVR tracked device.  The only connections it requires (all very easy) are a battery, an antenna, a USB port, and of course the IR sensors.

Other fun hardware treats we received were IR sensor suite "chicklets" (each containing an IR sensor, Triad Semiconductor's TS3633 ASIC, and a small handful of discrete components), some breakout/evaluation boards, input devices (including a trackpad identical to those in the Vive and Steam Controllers!), and some assorted ribbon cables.  Oh, and of course – the reference objects!

Experimentation

Synapse was kind enough to give us premade reference objects (which are really dense, by the way – one of the file names associated with them is "thors_hammer," and they are certainly worthy of such a name).  These objects can be used as digital input devices, or, if the handles are disconnected, they are large enough to fit over most modern HMDs for easy SteamVR Tracking integration!  Pictured below is a group of us testing out our reference objects.

Reference Objects

One of my favourite things about this image is how many controllers are connected in the SteamVR status window.  Also, if you don’t know who I am, I’m the lad wearing the rainbow shirt.  Hi!

Another reason we're flailing our controllers around is to calibrate the sensor positions.  When you prepare your files on a computer, all the sensors are in ideal locations.  This, of course, is not how the real world works.  So, through the magic of Valve UI, by flailing these unwieldy devices around for a minute or two, the system gets a pretty good idea of where the sensors are all actually located.  We then take these corrected positions and write a new JSON file to the controller via the lighthouse console (this console is the bread and butter of interfacing with your tracked objects – you can extract firmware, stream IMU data, enter calibration, etc.).
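
The round trip is conceptually simple.  Here's a hedged sketch of that last step (field names match my illustrative JSON sketch above, not Valve's exact schema):

```python
import json

def apply_calibration(config_path, calibrated_points, out_path):
    """Splice freshly calibrated sensor positions back into the device
    JSON before re-uploading it via the lighthouse console.  Array order
    must be preserved, since it implies the sensor IDs."""
    with open(config_path) as f:
        config = json.load(f)
    # Swap the idealized CAD positions for the measured ones.
    config["lighthouse_config"]["modelPoints"] = calibrated_points
    with open(out_path, "w") as f:
        json.dump(config, f, indent=2)
```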

Steam Controller

Fooling around a bit more, I decided to alter the JSON file on my reference object so it would be identified as a Steam Controller.  And lo and behold!  A Steam Controller in the SteamVR white room!  Positionally tracked in all its glory.
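
For the curious, the swap is conceptually tiny: a couple of identity fields in that same JSON.  The values below are made-up placeholders, not the exact strings involved:

```json
{
  "device_class": "controller",
  "render_model": "steam_controller"
}
```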

And then to close the night, Synapse hosted a Happy Hour for us all to be social and inevitably geek out about VR together.  We chatted about hardware, locomotion, multiplayer, narrative… I love this industry.

As usual, please feel free to post to the comments or email me at blog@talariavr.com with any questions or conversation.  Be on the lookout for my summary post where I will answer community questions in bulk once the training is done.  Cheers!

Comments

Mark Hogan:
Hi Peter,
Thanks for continuing this great account!
You mentioned the sensors have an aperture; does that just mean they're inset from the surface?
