From the very beginning we set out to make games that let a person actually look with their head, walk with their feet, and interact with their hands, like in real life. In a modern first-person game, you use a mouse to look around and shoot and a keyboard to move, so with Project Holodeck we had to throw conventional wisdom out the window. We faced some big design challenges because we could not do what every other game out there does.
For instance, we could not just have a menu that you click on with your mouse. Instead, we had to build spatially placed elements and buttons that you physically reach out and interact with, spread across multiple sets of game menus (or “Ready Rooms”). HUD elements had to be kept to a minimum, with the rare exception of contextually activated reticles. For example, in Wild Skies the ship's health is displayed on a small floating orb that changes color, and the ship itself has rings and other color indicators built onto it, so that health information is conveyed as diegetically as possible.
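To make the idea concrete, here is a minimal sketch of how a diegetic health indicator like that orb can work: map a normalized health value to a color that the orb and hull rings display. The function name and the green-to-yellow-to-red ramp are illustrative assumptions, not our actual Wild Skies code.

```python
# Hypothetical sketch: map normalized ship health (1.0 = full, 0.0 = destroyed)
# to an RGB color that a floating orb or hull ring could display.

def health_to_color(health):
    """Lerp green -> yellow -> red as health drops."""
    health = max(0.0, min(1.0, health))  # clamp to [0, 1]
    if health > 0.5:
        # Full to half health: green fades toward yellow.
        t = (health - 0.5) / 0.5
        return (1.0 - t, 1.0, 0.0)  # (R, G, B)
    else:
        # Half to zero: yellow fades toward red.
        t = health / 0.5
        return (1.0, t, 0.0)

print(health_to_color(1.0))   # (0.0, 1.0, 0.0) -> green
print(health_to_color(0.25))  # (1.0, 0.5, 0.0) -> orange
```

The point is that the player reads the ship's state off the ship itself, the same way you would read a warning light in a real cockpit, rather than off a floating 2D overlay.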
For story, we could not momentarily take camera control away from the player to craft cinematic moments. Instead, we chose to use in-game screens and radios to direct the player's attention. It's a lot of the same environmental storytelling conventions we learned from Half-Life 2, but crafted specifically for fully embodied VR.
Most people do not understand how difficult it is to make networked games, especially when you are already dealing with flying ships, VR hardware, and everything else. It was a massive challenge to run three virtual worlds simultaneously across the server and clients, let two people observe and interact with each other's in-game avatars in real time, and allow them to move about a shared space and pick up shotguns and sniper rifles with motion tracking, all while flying at 100 miles per hour in three dimensions and being attacked by enemy AI that dispatches its own boarding parties of pirates and fires homing cannon shots. I kid you not. The biggest engineering challenge of my life.
We run the client games on two backpack laptops worn by the players, each independently and concurrently simulating the game. This is the only way we found to deliver such an immersive experience at low cost with existing technology.
Because of this, the three instances of the virtual reality “universe” must be synchronized in real time to deliver a convincing shared experience.
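Here is a minimal sketch of the kind of snapshot-based synchronization this implies: the server broadcasts authoritative world state at a fixed tick, and each client interpolates between the two most recent snapshots so motion stays smooth. The wire format, addresses, and every name below are illustrative assumptions, not Project Holodeck's actual netcode.

```python
# Hypothetical sketch of snapshot-based state sync between the server and
# the two player clients. Not Project Holodeck's actual netcode.

import json
import socket
import time

TICK_RATE = 30  # snapshots per second (assumed)
CLIENTS = [("10.0.0.2", 9000), ("10.0.0.3", 9000)]  # backpack laptops (assumed)

def snapshot(world):
    """Serialize the state all three instances must agree on."""
    return json.dumps({
        "t": time.time(),
        "ship": world["ship"],                # airship position + orientation
        "avatars": world["avatars"],          # head/hand tracking poses
        "projectiles": world["projectiles"],
    }).encode()

def server_loop(world):
    """Broadcast authoritative snapshots to both clients at a fixed tick."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        data = snapshot(world)
        for addr in CLIENTS:
            sock.sendto(data, addr)
        time.sleep(1.0 / TICK_RATE)

def lerp(a, b, t):
    return [x + (y - x) * t for x, y in zip(a, b)]

def interpolated_ship_pos(prev, nxt, render_time):
    """Clients render slightly in the past, blending between the two most
    recent snapshots so the ship moves smoothly even if a packet is late."""
    span = nxt["t"] - prev["t"]
    t = 0.0 if span <= 0 else (render_time - prev["t"]) / span
    return lerp(prev["ship"]["pos"], nxt["ship"]["pos"], max(0.0, min(1.0, t)))
```

The hard part in practice is not the loop itself but deciding what is authoritative where: motion-tracked limbs have to feel instant locally while still matching what the other player and the audience see.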
The server camera is central to our system. To the audience, it is a window into the virtual world we have crafted. It's a lot like watching an animated movie, but the actors are acting it out in real life, in real time, in the playspace. You could call it the theater of the 21st century.
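One small technique that matters for a camera like this is smoothing: if the audience window snaps to every tracked movement, the shot jitters. A rough sketch follows; exponential easing toward the midpoint between the two players is an illustrative choice, not necessarily what we shipped for Demo Day.

```python
# Hypothetical sketch of keeping the audience-facing server camera steady.

class SpectatorCamera:
    def __init__(self, smoothing=0.1):
        self.pos = [0.0, 1.7, -3.0]   # start behind the playspace (assumed)
        self.smoothing = smoothing    # fraction of the gap closed per frame

    def update(self, target):
        """Ease toward the point of interest instead of snapping to it."""
        self.pos = [p + (t - p) * self.smoothing
                    for p, t in zip(self.pos, target)]
        return self.pos

def midpoint(a, b):
    """E.g., frame the shot on the midpoint between the two players."""
    return [(x + y) / 2.0 for x, y in zip(a, b)]
```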
Check out our server camera technique and virtual playspace in action at USC Games Demo Day 2013 below:
Editor’s Note: After 300 all-nighters working towards Demo Day 2013, Alex is now in hibernation. While he’s not that fond of blogging, journalistic writing, or even idle conversation, you are welcome to contact him at firstname.lastname@example.org. He might tolerate your email if you pose a question worth his time.