By: Meg Athavale
This is a true story.
In 2012, I was a young entrepreneur with a newly launched interactive projection software platform, and I was asked to participate in Canada's inaugural Tech Women Canada program. The premise is that women founders of technology companies face a lot of hurdles, especially outside Silicon Valley (we don't get funded nearly as often, and we almost always need to travel to find our market). Plus, Canadian cities just don't have as mature a funding ecosystem.
So I was chosen, along with 12 other women, to visit Silicon Valley, learn from lawyers and investors, and access mentorship from women in the valley who could connect us to resources and offer feedback once we returned home.
This was the year Leap Motion was announced. As someone whose entire business is designing and installing gesture- and motion-controlled interactive environments, I was super excited about this new type of sensor, which claimed to be able to track fingers so precisely you could use it for medical purposes. I phoned them. I emailed them. I really wanted to meet them. But no one answered the phone.
Being the scrappy little startup founder I was (and having no concept of how much more terrifying American law enforcement officers are), I tracked down an address, broke into a compound by climbing a fence, and left a little heart-shaped note on a door declaring my boundless enthusiasm for their project and asking if we could please be developers.
We received a Leap a few months later. Our early experiments were less than awe-inspiring, but we definitely had fun with it.
What we found was that using your fingers to control desktop applications wasn't intuitive; it was actually pretty clumsy compared to touchscreen and trackpad. Even the mouse was a simpler navigation tool. And no wonder... with touch and mouse, you have an affordance on the screen that can be precisely moved to where you want it, and firm, definitive actions with haptic feedback (like clicking or pushing) to make stuff happen.
This is where proprioception comes in. Proprioception is your intuitive sense of where your body is and what it's doing; your brain maintains a very refined map of where your body parts are at all times. That's why using the Leap Motion (or Kinect, or any other gesture-based controller) to control your camera view, or to interact with any element on a screen, is really difficult: any mismatch between where your brain knows your hand is and where the on-screen element appears is immediately obvious.
This means that unless there is a smooth and predictable relationship between your body and an interactive element like a cursor, you don't feel like you're in control. Think about the last time you had a sketchy (or dirty) mouse. Basically, our first Leap Motion experience felt like that. Some of this is due to the infancy of gesture technology itself; even now there are few defined metaphors in gesture control that aren't holdovers from touchscreen interactions, like swipe and pinch-to-zoom.
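One common way to make that body-to-cursor relationship feel smooth and predictable is to filter the raw tracking data before it drives the cursor. Here's a minimal sketch (not Leap Motion's actual API; the function name and sample values are my own) of exponentially smoothing a stream of noisy hand positions:

```python
# Minimal sketch: exponential moving average over noisy (x, y) hand samples,
# so the cursor tracks the hand without jittering. Not Leap Motion's API.

def smooth_cursor(samples, alpha=0.3):
    """Smooth a stream of (x, y) hand positions.

    alpha near 1 -> responsive but jittery; alpha near 0 -> smooth but laggy.
    """
    smoothed = []
    sx, sy = samples[0]  # seed the filter with the first sample
    for x, y in samples:
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        smoothed.append((sx, sy))
    return smoothed

# Jittery raw samples hovering around (100, 50):
raw = [(100, 50), (104, 47), (98, 52), (103, 49), (99, 51)]
print(smooth_cursor(raw)[-1])
```

The catch is the trade-off baked into `alpha`: too much smoothing and the cursor lags behind your hand, which breaks the proprioceptive link just as badly as jitter does. Tuning that balance is part of why gesture cursors are hard to get right.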
While there are many examples of Leap being used to control desktop applications, the last few years have seen the company diverge from what we needed in our business. Instead of tracking bigger areas and more body parts, they focused on perfecting hand tracking and moved from screen-based interaction to virtual reality.
Leap Motion quickly recognized that hand tracking is a huge and important problem to solve in virtual and augmented reality, and their sensor is uniquely well designed for the problem. This is in part related to the emergence of VR/AR headsets like Oculus and Meta. It's a great story; many cool hardware innovations have no real defined use case until a developer community joins the party. What Leap learned from their community was that their hardware filled a gaping hole in VR interface control devices.
To make it work, the Leap's sensor has to be positioned directly on a VR headset, which initially made for an awkward and uncomfortable experience, often involving duct tape or rubber bands. But as soon as developers started creating VR experiences in combination with Leap Motion, a lot of cool concepts started showing up in their newsletters. The VR development community was obviously much more active and enthusiastic than the screen-based one.
Leap has recently announced Orion, hand tracking designed specifically for VR. The sensor is still mounted on the headset, but the setup has been streamlined and includes a simple mounting system, eliminating the early days of duct tape. The controls are also more refined, although as Nicole Lee of Engadget observes: "I didn't get the 'grip' on boxes I would've liked and the controls aren't precise enough for me to create a game of Jenga."
So, proprioception, or the sense that your virtual hands are an exact and predictable extension of your real hands, is not yet fully realized. But thanks to Leap Motion's Orion, we're getting a lot closer to a hand tracking system that can pass what I think should forever be known as 'The Virtual Jenga Test'. And while our team has fewer and fewer opportunities to develop for the Leap Motion, our friends at The Campfire Union are definitely excited about the new Orion software.
Jocelyne is hoping to use our original Leap Motion to learn how to make her own version of Dog of Wisdom, so we will definitely continue playing around with it, and we'll post our projects as long as they aren't too embarrassing. :)