AWS for Games Blog

Now Available – VR Sample

We’re proud to announce the availability of the Virtual Reality Samples Project, a set of three levels that demonstrates techniques used in the production of VR content. The new project, available as a standalone download here, showcases VR features including stereo video playback, haptic feedback, and spatial audio. These levels provide a starting point from which developers can build their own VR projects.

We believe rapid iteration is crucial to the development of great products, so we’re invested in making tools that are easy to learn and that accelerate the creation of developers’ most ambitious projects. Last year, we re-architected the VR system to be more modular and scalable by moving device support into the Lumberyard Gems system, enabling developers to add support for any VR devices they desire. To further reduce iteration time, we implemented the “VR Preview” button, which enables instant testing of VR applications directly from the Editor. More recently, in the Lumberyard Beta 1.7 release, we created platform-agnostic tools that let developers build once and deploy on multiple VR platforms, eliminating much of the effort of working around platform-specific quirks. In the coming months, our team is excited to continue simplifying and expanding the tools and samples that let developers build great VR experiences quickly. The VR Samples Project demonstrates many techniques for creating VR content quickly, and we’d like to walk you through three specific examples:

Video Playback
We build our roadmap entirely from our customers’ needs, and video playback has been one of the most requested features for Lumberyard. For example, one of our VR customers wanted to record 3D video of scenes from an upcoming movie in order to develop a complementary VR experience for the film’s release. To help them achieve this, we added the Video Playback Gem in Lumberyard Beta 1.7, which allows video assets to be inserted into a game scene. When the Video Playback Component is added to any Component Entity in a scene, it will load and play back video up to 4K resolution at 60fps in mono or stereo. You can use either the FFmpeg or Libav libraries, which support every major video container and codec (e.g., MP4 with H.264 and H.265/HEVC, WebM/MKV with VP8 and VP9, MOV with the QuickTime QTRLE codec, and more). The video is rendered to a texture that can be applied to any surface in the world and, just like any texture, can have material effects applied to it. Because performance is especially critical to our VR customers, we designed the video playback system to make efficient use of render targets, so a single video can be rendered onto potentially hundreds of surfaces simultaneously with no loss in performance.
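To give a feel for the scripting side (the TV Room level below also demonstrates the Flow Graph equivalent), here is a minimal Lua component sketch that starts, pauses, and stops playback on an entity carrying a Video Playback Component. The bus and method names (VideoPlaybackRequestBus, Play, Pause, Stop) and the property are illustrative assumptions patterned on Lumberyard’s EBus-style Lua bindings, not a definitive API reference.

-- Illustrative sketch only: the bus and method names below are assumptions.
local VideoScreen =
{
    Properties =
    {
        -- Hypothetical property: the video asset assigned to the component.
        VideoAsset = { default = "videos/trailer_stereo.mp4" },
    },
}

function VideoScreen:OnActivate()
    -- Ask the Video Playback Component on this entity to start playing.
    VideoPlaybackRequestBus.Event.Play(self.entityId)
    self.playing = true
end

function VideoScreen:TogglePause()
    -- Pause or resume playback, e.g. in response to controller input.
    if self.playing then
        VideoPlaybackRequestBus.Event.Pause(self.entityId)
    else
        VideoPlaybackRequestBus.Event.Play(self.entityId)
    end
    self.playing = not self.playing
end

function VideoScreen:OnDeactivate()
    VideoPlaybackRequestBus.Event.Stop(self.entityId)
end

return VideoScreen

Because the component renders into a texture, the same script works whether the video ends up on a single screen or on hundreds of surfaces sharing that render target.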

[Image: vr-sample-many-tvs]

The Video Playback component can be placed on a sphere to create an immersive 360-degree video playback experience. To take advantage of any existing content you may have, the component accepts videos in top-bottom or bottom-top stereo and supports animated alpha masks with MOV video, which is useful for those who are experimenting with compositing video in a 3D scene.

The TV Room level demonstrates the full range of the Video Playback Gem’s functionality using a TV model with a stereo video texture applied to it. You can switch between three video styles showing 2D, 3D, and spherical playback. The level shows how to load video files to play on any surface, how to control video playback with Flow Graph or Lua, and how to set up stereo 360-degree videos. Try increasing the number of TVs in the sample from one to hundreds to see the performance for yourself!

[Image: vr-sample-tv-room]

Motion Controller Haptic Feedback
Haptic feedback is an important feature of VR because it can add anything from subtle texture to bombastic rumble, enhancing presence in your virtual world. Lumberyard’s force-feedback system has been updated to allow motion controllers to be receivers of force-feedback events, letting developers leverage the force-feedback functionality already present in Lumberyard. In Lumberyard Beta 1.7, we’ve updated the Oculus and OpenVR Gems to include haptic feedback for the Oculus and Vive motion controllers. As mentioned previously, the modular nature of the Lumberyard VR system lets developers author their own Gems to support the rapidly growing ecosystem of VR devices.

[Image: vr-sample-box-garden]

The Box Garden level demonstrates four fundamental features of a VR application: haptic feedback, playspace control, motion controller tracking, and handling input from motion controllers. Haptic feedback is demonstrated using a Flow Graph script, which uses force-feedback envelopes specified in XML to author custom effect magnitude and duration. Playspace control gives access to the dimensions of the playspace for room-scale applications. We demonstrate a general use case of scaling floor space in the Box Garden level, but creative developers could use these outputs for any purpose, including scaling the number of entities that spawn in a space or scaling the objects within it. The virtual motion controllers in this sample are linked directly to the physical motion controller transforms that are output from flow nodes. In the future, we will be moving this functionality to a simple drag-and-drop component that can be attached to any Component Entity and switched based on the attached device ID, so you can support a wide variety of controller models using a single component.
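The sample drives haptics through Flow Graph, but the same idea can be expressed in script. The fragment below is a hypothetical Lua sketch that plays a named force-feedback effect (authored as an XML envelope of magnitude and duration) on the motion controller that grabbed an object; the ForceFeedbackRequestBus name and its PlayForceFeedbackEffect call are assumptions for illustration, not the shipped API.

-- Hypothetical sketch: the bus, method, and effect names are assumptions.
local HapticPulse =
{
    Properties =
    {
        -- A force-feedback effect authored as an XML envelope
        -- (custom magnitude and duration curves).
        EffectName = { default = "controller_pulse_short" },
    },
}

-- Call this when the player grabs or shoots something with a motion
-- controller; deviceId identifies which controller should rumble.
function HapticPulse:OnGrabbed(deviceId)
    ForceFeedbackRequestBus.Broadcast.PlayForceFeedbackEffect(
        self.Properties.EffectName, deviceId)
end

return HapticPulse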

Audio Production for VR
To provide a feeling of presence in an immersive VR world, players need to be able to reconcile the position of audio cues with the visual activity they accompany, in a manner consistent with the real world. VR audio tweaks the traditional 3D audio model to give you a more personal sense of listening in the environment. Game audio is traditionally “flattened” into a 2D field and spatialized around the listener based on their orientation, but with VR you need to do more. When you move your head around, the sound field needs to make sense to your brain, which means tricking the ears into knowing whether sounds are in front of or behind you. To accomplish this, we use filtering and micro-delays that approximate the time it takes a sound to reach each ear individually.
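To make “micro-delays” concrete: the dominant cue for left-right localization is the interaural time difference (ITD), the slightly earlier arrival of a sound at the nearer ear. The sketch below uses Woodworth’s classic spherical-head approximation, an illustrative model rather than the exact algorithm inside any particular spatializer plugin.

-- Interaural time difference (ITD) via Woodworth's spherical-head
-- approximation: ITD = (r / c) * (theta + sin(theta)), where theta is
-- the source azimuth in radians (0 = straight ahead).
local HEAD_RADIUS = 0.0875    -- metres, an average adult head
local SPEED_OF_SOUND = 343.0  -- metres per second in air

local function interauralTimeDifference(azimuthRadians)
    return (HEAD_RADIUS / SPEED_OF_SOUND)
        * (azimuthRadians + math.sin(azimuthRadians))
end

-- A source 90 degrees to one side arrives roughly 0.66 ms earlier at the
-- near ear: tiny, but enough for the brain to localize it.
print(string.format("ITD at 90 degrees: %.3f ms",
    interauralTimeDifference(math.pi / 2) * 1000))

Frequency-dependent filtering by the head and ears supplies the complementary level and spectral cues that help distinguish front from back.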

The Xylophone Sample was created to demonstrate spatial audio for customers using the full, licensed version of Audiokinetic’s Wwise with the Audiokinetic Oculus Spatializer plugin. The level lets you fire blocks at a giant xylophone that plays simple notes, and topple a giant run of dominoes arranged to play a familiar song. As you turn your head to look around, you should be able to naturally pinpoint where each sound is coming from.

[Image: vr-sample-xylophone]

SDK Updates in Beta 1.7
Lumberyard Beta 1.7 also updates the Oculus SDK from 1.5 to 1.9 and the OpenVR SDK from 1.0.0 to 1.0.3. Please see our release notes for more information.

What’s Next?
VR is still in its infancy, and we’re excited to see what emerges as ambitious developers strive to create spectacular VR experiences. Our team looks forward to working with those customers and providing the tools they need to explore this new frontier. If you’re going to be at GDC this year, please swing by our booth to meet the Lumberyard VR team, check out the technology we’re currently working on, and tell us about the things you’d like to see in the future. And if you won’t be at GDC, please let us know how we can help on the forums.

Download the VR Samples Project here.


About Our Contributors

Andrew Goulding has shipped games on every major platform since the Game Boy Advance, working in gameplay engineering, design, production, and writing. He’s been at Amazon for over two years and is one of the Lumberyard VR team’s senior engineering leads. He spends his time dreaming up and building tools that will enable virtual and augmented reality to revolutionize entertainment, business and social experiences.

In Young Yang joined Amazon in 2015, bringing experience from popular AAA projects, including the Halo and Tom Clancy series. In her current role as content lead on the Lumberyard VR team, she is excited to explore new technologies that can deliver transformative and immersive user experiences.