It was a wild weekend at Indiecade 2013! Project Holodeck showcased in the Digital Selects tent and drew massive crowds throughout the festival. We hated making everyone wait in line for so long – we hope to have multiple systems up and running next time! Check out the Indiecade Video Tour at Joystiq featuring our virtual reality antics with the full Holodeck system, then head over to our very own Neon Tommy for a writeup on Project Holodeck and other USC-affiliated projects. Robert Nashak also wrote a fantastic article on KCET about Indiecade – be sure to VOTE for it to be turned into a documentary!
Here are a few photos we’ve uncovered from the interwebs:
More photos courtesy of Neon Tommy on Flickr
Sweden Game Conference 2013: Showcasing New Project Holodeck Prototypes at Gothia Science Park in Skövde
We’re showcasing Project Holodeck this week at the Sweden Game Conference in Skövde! We’re absolutely thrilled to be here, and this is our first opportunity to demo our new prototypes with the public. The new prototypes (we call them V0.2) use a custom optical solution that eliminates the need for a PlayStation 3 and PS Move controller, and they also have custom backpack enclosures made with our new MakerBot Replicator 2.
The Sweden Game Conference features two speaker tracks: one on business and one on game development. On the game development track, Nathan Burba will speak about the technical challenges and breakthroughs we encountered on Project Holodeck while working with a myriad of motion tracking devices and several different VR headset iterations. On the business track, James Iliff will speak about the evolution of Project Holodeck and the future market of consumer VR. There are also talks by a number of developers that we are very excited to see, such as Linden Lab and Coffee Stain Studios! With Linden Lab’s recent interest in integrating Second Life with the Oculus Rift, this conference is especially significant and timely.
There will also be a panel on Virtual Reality and the Future of Gaming! You can find more information on the talks and panels here.
It was a wild success! On Tuesday we showcased Wild Skies to a theatre full of 500 game industry veterans, journalists, faculty, and students, and due to the tireless efforts of the School of Cinematic Arts staff and event coordinators, we managed to pull off a really strong demo. We went through three tech rehearsals and numerous all-nighters to get the experience ‘audience ready’ in a way that worked well with a massive cinema projector.
And here’s a short article on Project Holodeck Demo Day by Ben Lang of RoadtoVR:
The Norris Cinema Theatre was a truly amazing venue, and Demo Day was a blast. A cocktail reception followed in the Spielberg Building where people got to try out the Holodeck experience themselves! Photos:
From the very beginning we set out to make games that allow a person to actually look with their head, walk with their feet, and interact with their hands like in real life. In modern first person games, one uses a mouse to look around and shoot, and a keyboard to move – so with Project Holodeck we had to throw conventional wisdom out the window. We had some big design challenges because we could not do what all the other games out there are doing.
For instance, we could not just have a menu that you click on with your mouse; instead we had to build spatially arranged elements and buttons that players physically interact with across multiple sets of game menus (or “Ready Rooms”). HUD elements had to be kept to a minimum, with the rare exception of contextually activated reticles. For example, in Wild Skies the ship’s health is displayed on a small floating orb that changes colors – and the ship itself has rings and other color indicators built onto it, so that health information is displayed as diegetically as possible.
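As a concrete illustration of the floating-orb idea, here is a small hypothetical sketch (invented numbers and names, not the actual Wild Skies code) of how hull health might map to the orb’s color, blending from green through yellow to red:

```python
def health_color(health):
    """Map hull health in [0.0, 1.0] to an (r, g, b) color.

    Full health is green, half health is yellow, zero is red,
    so the orb itself communicates state without any HUD text.
    """
    health = max(0.0, min(1.0, health))            # clamp out-of-range values
    if health > 0.5:
        return ((1.0 - health) * 2.0, 1.0, 0.0)    # green fading to yellow
    return (1.0, health * 2.0, 0.0)                # yellow fading to red
```

In an engine, the returned tuple would drive the orb material’s tint each frame; the point is that the player reads the state from the world, not from an overlay.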
For story, we could not momentarily take control of the player camera to craft cinematic moments. Instead we chose to use in-game screens and radios to direct player attention. It’s a lot of the same environmental storytelling we learned from Half-Life 2, but crafted specifically for fully embodied VR.
Most people do not understand how difficult it is to make networked games, especially when you are already dealing with things like flying ships, VR hardware, and all that. It was a massive challenge to run three virtual worlds simultaneously across server and clients, having two people observe and interact with each other’s in-game avatars in real-time, and allowing them to move about in a shared space and pick up shotguns and sniper rifles with motion tracking, all while flying at 100 miles per hour, in three dimensions, being attacked by enemy AI that dispatches its own boarding parties of pirates and fires homing cannons. I kid you not. Biggest challenge in engineering history.
We run the client games on two backpack laptops worn by the players, each independently and concurrently simulating the game. This is the only way to deliver such an immersive experience at low cost with existing technology.
Because of this, the three instances of the virtual reality “universes” must be synchronized in real time to deliver a convincing mutual experience.
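To make that synchronization concrete, here is a minimal, hypothetical sketch (Python, with invented names; the real system was engineered quite differently) of one common approach: an authoritative server broadcasts state snapshots, and each client smooths between the last two snapshots it received so movement looks continuous:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Snapshot:
    """Authoritative world state broadcast by the server each tick."""
    tick: int
    positions: dict          # entity id -> (x, y, z)

def lerp(a, b, t):
    """Linearly interpolate between two 3D points."""
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

class ClientWorld:
    """Each backpack laptop keeps its own copy of the universe and
    smooths between the last two server snapshots it received."""
    def __init__(self):
        self.prev = None
        self.latest = None

    def receive(self, snap):
        self.prev, self.latest = self.latest, snap

    def render_positions(self, t):
        """t in [0, 1]: how far we are between the two snapshots."""
        if self.prev is None:
            return dict(self.latest.positions) if self.latest else {}
        return {eid: lerp(self.prev.positions[eid], pos, t)
                for eid, pos in self.latest.positions.items()}

# Both clients receive the same snapshots, so all three instances
# of the "universe" agree on where the ship is.
client = ClientWorld()
client.receive(Snapshot(1, {"ship": (0.0, 100.0, 0.0)}))
client.receive(Snapshot(2, {"ship": (10.0, 100.0, 0.0)}))
midway = client.render_positions(0.5)
```

The interpolation is what hides network latency: each client renders slightly in the past, between two known-good states, rather than guessing ahead.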
The server camera is central to our system. To the audience, it is a window into the virtual reality world that we have crafted. It’s a lot like watching an animated movie, but the actors are acting it out in real life, in real-time, in the playspace. You could call it the theater of the 21st century.
Check out our server camera technique and virtual playspace in action at USC Games Demo Day 2013 below:
Editor’s Note: After 300 all-nighters working towards Demo Day 2013, Alex is now in hibernation. While he’s not that fond of blogging, journalistic writing, or even idle conversation, you are welcome to contact him at firstname.lastname@example.org. He might tolerate your email if you pose a question worth his time.
Demo Day at the USC School of Cinematic Arts is just around the corner, and we just whipped up our latest batch of screenshots. Wild Skies has undergone a massive visual overhaul in the past two months, thanks to the talented individuals at the Gnomon School of Visual Effects. Check out their incredible work in 3D modeling and concept art below!
You can also see a lot of new weapons, such as the sniper rifle and machine gun. These guns are held with two hands using Razer Hydra motion controls, giving an even more natural sense of control to weapon combat in a virtual environment. They include simulated recoil: hold a gun with only one hand and it kicks much harder than when steadied with two.
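The one-hand versus two-hand recoil rule can be sketched roughly like this (hypothetical constants and names, not the actual Wild Skies tuning):

```python
BASE_KICK_DEG = 8.0      # hypothetical muzzle climb per shot, in degrees
TWO_HAND_DAMPING = 0.35  # a steadying second hand soaks up most of the kick

def recoil_kick(hands_on_grip):
    """Return the upward muzzle kick (degrees) for one shot.

    Steadying the weapon with two hands damps the recoil sharply,
    while a single hand takes the full kick.
    """
    if hands_on_grip >= 2:
        return BASE_KICK_DEG * TWO_HAND_DAMPING
    return BASE_KICK_DEG
```

Because each Hydra wand reports its own position, checking whether the second hand is on the foregrip is just a distance test between the two controllers.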
Zombies on the Holodeck is going to be at the 5D Institute’s Science of Fiction exhibition this weekend! This event will bring together leading minds from a wide spectrum of creative and scientific fields to imagine and create new worlds on the borderland between art and science. Project Holodeck is super excited to be participating!
What’s the game?
Zombies on the Holodeck uses the modular Holodeck platform to immerse players in a survival horror environment. Players use the Oculus Rift to explore with peripheral vision and head tracking, and use Razer Hydra controllers to shoot guns, pick up items, and build defenses. If players choose to use all the Holodeck hardware, the game can support two players together over a network, sharing a playspace, and moving in 360 degrees with positional tracking. Alternatively, if all you have is a laptop, you can just use mouse and keyboard – although admittedly this is not ideal! Check out our previous post on the philosophy behind the game and why we want to explore the boundaries of virtual reality as entertainment.
The Science of Fiction:
Science of Fiction will move beyond tired binaries to present a one day World Building event. Teams that include leading theorists and practitioners from a wide spectrum of creative and scientific disciplines will join with young innovators who are working on the bleeding edge. Using the narrative design principles of World Building, these revolutionary minds will work together to create an alternate visionary world, a near future reality that could lie just around the corner if only given the right impetus. This unprecedented event will launch a unique and persistent world designed to evolve through continuing transmedia projects at USC and 5D Institute.
What is 5D Institute?
5D investigates the future of narrative media. It is the gathering place for those who explore and mediate storytelling in the new creative and technological spaces beyond and between entertainment media. 5D uses the centrality of immersive design as a knife to cut through the membrane and across the silos of academia and industry. In this it is unique, and since 2007 the 5D Institute has built the framework and network to allow the narrative design space to become a portal for all media makers.
Check out the 5D website for more info!
At the GLIMPSE Digital Technology Showcase last month at the School of Cinematic Arts, local news station KTLA 5 came by to check out Project Holodeck and get some interviews. We can’t embed the video on our own blog, but you can check out the news spot here. Many projects at the MxR Lab got some great coverage as well, including the FOV2GO and VR2GO viewers.
Zombies on the Holodeck also got some attention from Stuff.tv, whose article featured some incredible mood pieces courtesy of our super-talented concept artist, Sung J Woo. We also got an extensive article from Jonathan Tustain of 3D Focus. Thanks for the awesome interview, Jon!
Also, Palmer and Oculus have a metric ton of updates coming up – if you missed Palmer at the SXSW panel, definitely check out the recap from the Verge. Also keep an eye out for the VR panel at GDC ’13, where Oculus VR will present “Running the VR Gauntlet – VR Ready, Are You?” with chief software architect Michael Antonov and co-founder Nate Mitchell.
Project Holodeck is super busy preparing for our Midterm showcase and the eventual Demo Day part 2 in Norris Cinema Theatre this May. More updates on the way soon!
Zombies on the Holodeck! New Game for Project Holodeck and Oculus Rift Plays with Film Conventions in Virtual Reality
Virtual Reality is a young medium, and Zombies on the Holodeck was started to explore virtual environments outside of strict realism. Recreating a wholly photorealistic world remains an enormous frontier in this emerging field. But because Virtual Reality offers the opportunity to create alternate realities altogether, why be bound to the notion that the alternate reality must strive to behave like the real one?
In that case, how does Film relate to Virtual Reality?
The Idea – Living in a Movie
Zombies on the Holodeck uses the modular Holodeck platform to immerse players in a survival horror environment. Players use the Oculus Rift to explore with peripheral vision and head tracking, and use Razer Hydra controllers to shoot guns, pick up items, and build defenses. If players choose to use all the Holodeck hardware, the game can support two players together over a network, sharing a playspace, and moving in 360 degrees with positional tracking. Alternatively, if all you have is a laptop, you can just use mouse and keyboard – although admittedly this is not ideal!
But hardware aside, the inspiration for Zombies on the Holodeck is to explore what VR might feel like if it were combined with film conventions – meaning the goal is not to convince the player that they are in a real world, but to convince the player that they are inside a film. That is the foundational idea. Since the VR medium is still so young, we believe it’s better to start with film conventions that audiences immediately know and recognize than to craft a subtle experience that attempts to innovate. Why? Because otherwise the player has far too much to process, and the VR game loses focus very quickly.
It’s a question of where to start. You’re not going to start out creating a VR experience that strives to be like Lost in Translation or Casablanca, because those films are full of subtlety and nuance that push the film medium itself to innovate. However, you CAN start out crafting experiences like Jurassic Park or Night of the Living Dead, because those movies are all about action within a contained environment. In other words: let’s not get ahead of ourselves.
It’s important to start with clichés that a large audience can recognize, so that everyone can easily grasp exactly how Virtual Reality makes them new and different. We know Jurassic Park on the silver screen, but what would it be like to actually live that experience in first person? How do we articulate in words how this is different in VR? This is an exercise in fleshing out the nature of VR as an entertainment medium.
With that in mind, the goal of Zombies on the Holodeck is to play with horror film conventions – specifically, conventions from the era when Hollywood horror was born. If we’re going to start with recognizable clichés, why not start with the original ones? Let’s try the 1940s, during the Golden Age, just after the silent era.
We give the VR game a title sequence, an old-fashioned musical score, and original Hollywood horror movie aesthetics – complete with lightning strikes and orchestral hits. The color palette is desaturated or even black and white. We have team members’ names fade in and out on the corners of the screen for the first minute, just like intro credits tend to do in movies. We play with elements on the screen, which means we’re actually playing with depth and parallax in a stereoscopic space. This is something that 99% of people have never witnessed before. No matter how we do it, it will be both cliché and totally new at the same time. In fact, that’s the point!
The Game – What Makes this Unique?
The game mechanics you have seen a million times before – killing zombies, attack phases, build phases, ammo, health, and the like. So what’s the big deal? With Project Holodeck we learned very quickly that gameplay is intimately tied to your input devices. A game plays much differently when you are using mouse and keyboard, or a joystick, or an Xbox controller, or a PS Move wand, or a Hydra. Gameplay and Input are two sides of the same coin.
By innovating with the input devices, as we’ve done with fully embodied virtual reality in Project Holodeck, you can provide a fresh perspective on even the most tried-and-true game conventions.
The objective of Zombies on the Holodeck is, of course, to survive as long as possible. The game is set in an alternate-history 1940s Chicago, where biochemical warfare has transformed the population into monsters. The core mechanic is all about attacking incoming monsters and building defenses in a contained barricade or “Hold Out” spot in the streets somewhere deep within the city. These build / attack phases run in roughly two-minute cycles.
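Those alternating cycles amount to a simple timer-driven state machine. A hypothetical sketch (names and numbers are illustrative, not the game’s actual code):

```python
PHASE_LENGTH = 120.0  # seconds per phase: "roughly two-minute cycles"

def phase_at(elapsed):
    """Return which phase the match is in at a given elapsed time.

    Even-numbered cycles are build phases, odd-numbered cycles
    are attack phases, alternating forever until the players fall.
    """
    cycle = int(elapsed // PHASE_LENGTH)
    return "build" if cycle % 2 == 0 else "attack"
```

The game loop would query this each frame to decide whether to spawn zombies or unlock the building interactions.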
In build phase, players have to use Hydra controls to physically pick up boxes and stack them to make defensive walls. They have to physically pick up wood boards and hammer them over doors – just like you would in real life. It’s a bit less complicated than hammering a nail in reality, but the motion becomes a gameplay mechanic in itself.
In attack phase, players will focus on killing incoming zombies with headshots. Weapons function with Razer Hydra motion controllers, making for very natural and intuitive gunplay. Players must aim down the sights of a gun to shoot – there are no reticles or HUD. A shotgun or machine gun requires two hands to operate.
Imagine how intense it is to be hammering a door shut with the Hydra controllers, then hear zombies approaching behind you from the street, and have to turn your head, pull out your handgun, aim, and blow some heads off. And the gun doesn’t have infinite ammo either – in fact, you don’t know how much ammo you have unless you’re counting. To reload, you have to pick up a clip and make the physical reloading motion. The same goes for health: say you gain health by stabbing yourself with some kind of antidote syringe – in that case, you have to pick up a syringe and physically stab it into your arm!
Shooting, reloading, healing, building, and even moving are basic gameplay elements that are radically transformed with Virtual Reality input devices. By using the Holodeck platform and the Oculus Rift to craft an immersive experience in this way, we can truly explore how the VR medium is transforming the nature of gameplay, narrative, film, and everything in between.
The game has already made significant headway, and we’ll soon be updating this website with a section dedicated to Zombies. The new trailer will be posted too. If you like the idea of Film and VR, or have ideas on what a VR survival horror game should be like, we would love to hear them! Leave a comment or shoot us an email.
“George, I wish you’d look at the nursery.”
In 1950, The Saturday Evening Post published “The World the Children Made” (later “The Veldt”), a chilling tale written by one of the masters of speculative fiction, the late Ray Bradbury. In this story, Peter and Wendy Hadley construct their own “Never-Never Land” in the Nursery – a magical room powered by technology.
As a huge Bradbury fan, when I heard that a friend of mine was a part of a team working on their own Nursery and was in need of a story to fill it with, I naturally jumped at the opportunity. Rather than lions on a desolate veldt, the first story to be told by Project Holodeck was to be one of battle in the open sky. This project was different from my earlier forays into fiction in that it was a collaborative process where many of the facts were already set. In many ways, the start of this experience mirrored my favorite part of being an engineer – namely, when you sit down with colleagues after a ton of work with a pile of data and try to find the physical explanation that ties it all together. Here we had a collection of interesting datapoints that needed explaining: two siblings on an airship, searching for their taken father, fighting off a myriad of diverse enemies, etc. Soon a likely model began to arise: the setting on an airship suggested a world where land travel could be hazardous or impossible, the variety of opponents suggested multiple means of propulsion, and the targeting of both the family and the father suggested that he may have some unique gift to offer and that the ship you were piloting may tie into that gift.
The volatile world of Galun and its habitable satellites, Larun and Daharun, quickly began to take shape. The driving force for the story was (much as is the case in our world) the importance of energy, all the more critical on a planet where survival depends not only on food and shelter, but on the ability to stay aloft. At center stage of this struggle is an inventor of a new source of energy that could greatly shift the balance of power, making both him and his prototype valuable commodities.
As Lead Writer, this whole process has been a fantastic opportunity to exercise my creative muscles in trying to help build a world tied (but not shackled) to scientific reality, especially in adapting and refining the story to incorporate the ever expanding set of capabilities presented by this new gaming platform and the ideas introduced by our great team of artists and designers. At every step, I have watched the world become more complete, the characters more real, and the gaming experience more immersive. Through this process, I hope that the end product will not only be a world that you can enter, but also one that you don’t want to leave.
The video is from April 2012 and features the FOV2GO mobile virtual reality kits, which have been showcased at conventions such as Maker Faire and SIGGRAPH. Mark also discusses the Wide5 Head-Mounted Display used for military simulations. One of the newer projects since this video is the Socket HMD, which the MxR Lab has been kind enough to lend to Project Holodeck for playtesting and showcases! You can read more about MxR Lab on their website here.