In my previous post, I talked about some of the highlights from the 2013 Neurogaming Conference and Expo. Did I mention that the Expo was in a nightclub? So basically on a pleasant weekday in early May, while you were probably engaged in performing or avoiding something at work, I was hanging out in a nightclub making drones fly around with my brain. Man, I’ve always wanted to say that!
Anyway, where was I? Ah yes, the bar had just opened. My go-to industry event cocktail tends to be a Gin & Tonic. It tastes great in a plastic cup and it’s almost impossible to screw up. So drink in hand, I continued my explorations.
Tactical Haptics: These guys are developing a technology called Reactive Grip, which is partially based on research out of the University of Utah. I’m not a gamer, so please feel free to correct me if I get some of this wrong. In a sense, they are extending the general paradigm of a motion-based game controller, such as the Razer Hydra by Sixense, by adding haptic feedback. Sounds straightforward enough, but it’s not. It was one of my favorite demos at the Expo, and I’ve been trying to figure out why. Basically, my limited experience with haptic feedback is that it’s like the 3D of the gaming world. With rare exceptions, studios churn out 3D movies because they feel like they’re supposed to, and the end result is an obligatory, thoughtlessly executed assault that is ultimately distracting. Haptic feedback can be the same way: I’m firing a big gun or driving a big car, my hands are buzzing, OK, I get it, tell me something I don’t know. I don’t feel any more involved in the experience just because I’m getting the exact same buzzing sensation at totally predictable points.
Tactical Haptics takes it much further. First of all, the haptic feedback in Reactive Grip is palm-facing, so it’s accessing what is probably a more vulnerable and sensitive part of your hand. As a result, it lends itself to enhancing the types of actions you would perform while gripping something with your palm and fingers. I have always found a disconnect between the way you hold a standard haptic game controller and the object it intends to emulate, such as a steering wheel or a gun. The form factor of Reactive Grip, however, maps pretty closely to how you would hold a sword, for instance. In fact, one of their demos featured a virtual on-screen mannequin that you could hack at with a sword. The continuity between what you see on the screen and what you feel in your hand is unlike anything I’d ever experienced. Even the level of feedback was regulated based on whether you were “touching” the mannequin or attempting to cut through it. The effect is uncanny, which is usually a good indication that you’re on to something big. Rather than providing a token enhancement, this technology really pulls you into the experience by engaging a critical sense in a well-considered way. These guys are looking to launch a Kickstarter and are hustling on the road, making appearances at Meetups in the Bay Area. Worth checking out!
NextGen Interactions: I had been wanting to check this out for a while, because it offered hands-on (or would that be heads-on?) experience with the Oculus Rift virtual reality headset. One of the guys in line mentioned that there had been “a line to get into the line” to try the Oculus Rift at the recent Game Developers Conference, so waiting a few minutes seemed like a good deal. Jason, the founder of NextGen Interactions, was letting visitors experience a prototype of a game he was developing for the headset. At this point, it’s hard to say anything about experiencing the Oculus Rift for the first time that hasn’t been said before, but the general consensus, with which I agree, can be accurately summed up as HOLY $#!+!!!
It’s real easy. You sit down in the chair, put on the goggles, and the world that you know disappears. Let me say that again. The world disappears. The first thing I did when I put on the goggles was look up and then look behind me. Yup, looking up at a tall building or back at an empty landscape. Total 360-degree immersion. Jason incorporated the Razer Hydra as the controller, which makes for a very intuitive way to move around, and once you start moving, it is impossible to continue believing that your chair is not actually moving. Actually, you basically forget all about the chair. Jason patiently walked me through the level he designed, where he had embedded some features that I appreciated, including puzzles that require you to interact directly with your environment, like picking up objects and stacking them, which you literally do with your hands, thanks to the Hydra. As a testament to the level of immersion, though, I found it very difficult to focus on what he was saying, because the whole concept of his voice coming in from some nightclub in San Francisco was totally alien to what was “real” for me: a post-apocalyptic landscape that I was intent on exploring. After about ten minutes of this, it was time to move on, and I actually found it somewhat disappointing to return to the real world. Jason was meticulous about collecting feedback, and apparently my enjoyment of the Gin & Tonic made me a subject of interest with regard to the potential for motion sickness. I admit to a slight queasiness, which I think I would have felt even without the drink. Hopefully, he had a control group. Oculus did not invent virtual reality, but they seem to have figured out how to engineer it in a consumer-friendly way (i.e., it won’t cost thousands of dollars). With developers like NextGen Interactions building content for the platform, it shouldn’t take long to catch on.
Coming Soon: Isaac Asimov posthumously sighs at what I’m getting out of the Foundation Trilogy.