Everyone has been talking about the Oculus Rift and its impact on the next generation of gaming. By immersing users in a 3D virtual environment, it's immediately obvious how the gaming experience, especially in first-person games, would be enhanced. But the current discussion around this peripheral still hasn't touched on its greatest potential. Once everything is set and ready, the Oculus Rift might mark the next wave of PC form factor, one that could change ALL of our interaction with computers.

It's not surprising for something developed for gaming to find greater success outside the gaming environment. Take the Kinect, for example: touted as a new form of game control, it has instead found greater use outside the gaming space. The same might go for the Oculus Rift as well.

Without a doubt, the Oculus Rift as a gaming platform is incredible. But what if we take it out of the game space? What if we use the Oculus Rift as a working environment? What if someone created an OS that specifically uses virtual space for how we work and how we create?

Imagine a workspace that is no longer confined to the rectangle in front of you, but is also at your sides, above, or below you. We could have several working windows open at once, not stacked one on top of the other, but placed to the side, in front, or behind. The current PC setup with multiple screens would go extinct just like that.


Picture: ARI - Heavy Rain


Picture: Dead Space

With that vision of the output we might get, we will need to reconfigure how we interact with the OS, because that interaction will be totally different. We need input rebuilt from the ground up for this new OS. Here we have several choices, though as of right now most are still in the realm of science fiction, with the most prominent being the gloves from ARI in Heavy Rain or from Minority Report.


Picture: Minority Report

This might be the next wave of the PC revolution. Just as Apple revolutionized the whole computing form with the iPhone and its touch interface, then built upon it with new mobile/tablet form factors, a virtual 3D environment such as the Oculus Rift, with the right implementation, might be the next force to revolutionize the PC form factor.


Edit: after some research into several other input devices currently in development, I have a few ideas of what might work for the Oculus Rift.

1. Typing: A hardware keyboard is essentially useless in the Rift, mainly because we plainly can't see where our fingers are, and there are too many keys for muscle memory alone unless you're a practiced touch typist. But the inventor behind the much-loved Swype Android keyboard is currently working on another touch-typing keyboard, specifically for tablet interfaces. The new keyboard is called Dryft. It detects where your hands come to rest on the touch surface and creates the keyboard around that placement. I suppose with some work this interface could be mirrored from the tablet screen into the OR, so that we can semi-blind-type on the tablet with the keyboard hovering virtually in the OR.


2. Gesture: The easiest to implement would be the Leap Motion. It can easily be set up on the table so that we can use gestures to control the OR. An additional plus: it's less tiring, since you have a table to rest your hands on. But which gestures would work intuitively will require a lot of experimentation (not unlike how we evolved our touch gestures over the past 7 years).

3. Cursor: We have to realize that a cursor in virtual space would work completely differently from how it works on your screen, mainly because now we have depth, a third dimension, to deal with. While the Leap Motion can detect gestures and your hand, I doubt it can pinpoint its location precisely, especially since we now have a screen that can move independently of your mouse pointer. I have a basic idea, though I'm not quite sure whether it would work (or how to build it).


Seeing as the new OR comes with a camera + IR combination to sense head movement, what if we added a different set of IR markers on glove fingertips that automatically track your finger positions relative to your head position? That position would then translate into a cursor inside the virtual space. Since it tracks in sync with your head movement, no matter how you turn, you'll instinctively know where your cursor/hand should be. Then we could have a virtual console that reacts to our fingers. And to provide feedback for clicking/touching, we would need additional rumble motors on the fingertips as well.
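The Dryft-style typing idea in point 1 can be sketched in a few lines. This is a minimal illustration, not Dryft's actual algorithm: it assumes eight fingers resting on the home row, builds a QWERTY grid anchored to those resting positions, and resolves each tap to the nearest key center.

```python
# Sketch of a Dryft-style adaptive keyboard (illustrative, not Dryft's
# real algorithm). The eight resting fingers are assumed to sit on the
# home row (a s d f j k l ;, thumbs excluded); the grid anchors to them.

QWERTY_ROWS = [
    "qwertyuiop",   # row above home
    "asdfghjkl",    # home row, anchored to the resting fingers
    "zxcvbnm",      # row below home
]

ROW_DY = 1.0  # vertical key pitch, in arbitrary touch units

def build_layout(rest_points):
    """Map each key to an (x, y) center from 8 resting-finger positions."""
    xs = sorted(p[0] for p in rest_points)
    home_y = sum(p[1] for p in rest_points) / len(rest_points)
    # Left pinky rests on 'a' (column 0), right pinky on ';' (column 9).
    pitch = (xs[-1] - xs[0]) / 9
    origin_x = xs[0]
    layout = {}
    for row_idx, row in enumerate(QWERTY_ROWS):
        y = home_y + (row_idx - 1) * ROW_DY   # touch y grows downward
        stagger = (row_idx - 1) * 0.25 * pitch
        for col, key in enumerate(row):
            layout[key] = (origin_x + col * pitch + stagger, y)
    return layout

def resolve_tap(layout, tap):
    """Return the key whose center is nearest the tap position."""
    return min(layout, key=lambda k: (layout[k][0] - tap[0]) ** 2
                                     + (layout[k][1] - tap[1]) ** 2)
```

In a real system the resting positions would come from the tablet's touch events, and the keyboard hovering in the OR would simply render this computed layout.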
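For the gesture idea in point 2, a tracker like the Leap Motion ultimately hands you hand positions over time, and even something as simple as a swipe has to be defined on top of that stream. A toy classifier, with made-up thresholds that would need exactly the experimentation mentioned above:

```python
# Toy swipe classifier over a stream of palm samples, the kind of
# primitive a Leap-Motion-style tracker could feed. Thresholds are
# placeholders, not tuned values.

SWIPE_MIN_DIST = 0.15   # horizontal travel required (assumed metres)
SWIPE_MAX_TIME = 0.5    # seconds; slower motion is not a swipe

def detect_swipe(samples):
    """samples: list of (t, x) palm samples; returns 'left', 'right', or None."""
    if len(samples) < 2:
        return None
    (t0, x0), (t1, x1) = samples[0], samples[-1]
    if t1 - t0 > SWIPE_MAX_TIME:
        return None
    dx = x1 - x0
    if dx >= SWIPE_MIN_DIST:
        return 'right'
    if dx <= -SWIPE_MIN_DIST:
        return 'left'
    return None
```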
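The head-relative cursor in point 3 boils down to a coordinate transform: take the fingertip's tracked world position and re-express it in the head's frame, so the cursor stays where your hand feels like it is no matter how you turn. A minimal sketch using yaw only (a real headset pose would have full 3-axis rotation):

```python
# Sketch of the head-relative cursor idea: an IR-tracked fingertip in
# world space is re-expressed in the head's frame. Yaw-only for brevity.

import math

def world_to_head(finger_w, head_pos, head_yaw):
    """Transform a fingertip world position into head-relative coords.

    finger_w, head_pos: (x, y, z) in world space
    head_yaw: head rotation about the vertical (y) axis, in radians
    """
    dx = finger_w[0] - head_pos[0]
    dy = finger_w[1] - head_pos[1]
    dz = finger_w[2] - head_pos[2]
    # Apply the inverse yaw rotation to the offset vector.
    c, s = math.cos(-head_yaw), math.sin(-head_yaw)
    return (c * dx + s * dz, dy, -s * dx + c * dz)
```

The resulting head-relative point is what the OS would draw as the cursor on the virtual console, and the fingertip rumble motors would fire when that point intersects a virtual control.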

PS: If an IR glove is too radical, I suppose something along the lines of a Wiimote might work just as well.