ARM’s Enlighten VR Development Diary – First Playable
Game development is an iterative process of exploring ideas in order to identify and create the right experience. Technical demo development is no different. In this next phase, we took the best elements and mechanics from the prototyping phase and consolidated them into a single playable experience. We then used a test-driven development process, refining and iterating on each of these ideas until the playable experience had practical, exciting lighting mechanics with strong messaging points, and was both intuitive and fun.
Importance of play-testing
VR is still a new development platform for us, and one crucial lesson the prototyping phase taught us was just how important regular play-testing – and iterating on its findings – is in achieving a great VR experience.
As an example, one of the initial prototypes we developed was a real-time lighting editor, where the player could call up a 3D widget interface and physically grab different light types out of a menu to spawn them into existence. They could then physically manipulate and edit these lights directly within the environment.
The mechanics of this play-tested really well, so we implemented it in our next prototype in the form of a holographic watch: the player could call up the hologram at will and pull various objects of interest out of their watch. We both really liked the way this worked, but when we play-tested it we were surprised to find that it just wasn't intuitive enough to use easily. We quickly realized we needed to abandon the spawning-watch concept and instead re-implement the same basic mechanics in a much simpler, stripped-down form. Spawning interactive balls of light, which the player can then use to light up a dark environment and influence the global illumination (GI), had exactly the effect we were looking for, but in a much more intuitive form.
Without this test-driven approach to our development, we would never have foreseen those weaknesses in our gameplay – and ultimately would have compromised our final experience.
We optimized our development process around iterative, test-driven development. We had a very aggressive timeline – six weeks, with two people on the team – to ship the final build for VRX. We therefore had to keep a tight focus and rapidly cull any features which didn't deliver.
We felt that a ‘real world’ planning tool would focus us better than something digital, so we used a ‘wall’ where each page represented a week and each post-it note a task.
Moving the planning process out into the room meant it happened in person, with the team members involved able to speak up about any issues, and it gave an accurate picture of the development. Having a week to deliver an end-to-end playable experience to someone external to the team for testing kept every aspect of the work focused. The first milestone was to deliver a playable experience which showed our key focus: great dynamic global illumination in VR.
We knew that we had to be ready to change anything in the experience, content or code, as soon as our playtests showed the need. All the content had to be highly modular, allowing us to explore ideas without spending time in a modelling package redoing polygons. At the same time, we couldn't judge whether a given idea delivered ‘great GI in VR’ – central to our experience – without something approaching final lighting and materials.
We knew from our prototyping phase that we wanted an environment with many interreflecting surfaces, so we settled on a maze-like environment. We wanted to communicate a sense of scale and height, and having a strong three-point perspective helped this. We built a Lego-style kit of modular environment sections at set scales: 50x50cm, 200x200cm and 400x400cm. We chose these scales because we found they worked well in VR – neither too big nor too small. The environment layout happened in-editor, so we were able to rapidly change it when needed.
Our main environment texture was a concrete prefab block authored in Substance Designer. We kept some strong lines in the texture to support perspective. UVs were projected in world space in-editor, keeping them independent of the geometry; this was made possible by our environment sections being aligned to the world XYZ axes. Once we had enough of the environment built, the multiple strong perspective lines worked really well when combined with the large-scale gradation from the indirect lighting and the good separation between foreground, middle ground and background, provided both by the stereo rendering in the headset and by the difference in lighting tonality.
The same was also true of our Blueprints. Remaining fully aware that we might need to completely change, or continually tweak, any aspect of the gameplay at any time, we made effective use of object inheritance, Blueprint interfaces and shared function libraries. Directly exposing many behavior-related parameters on actor Blueprints also really helped us to balance our gameplay quickly and effectively.
This test-driven development process gave us the flexibility and agility to quickly explore ideas and build on them. Although we discarded around 60% of our ideas, the process gave the team a strong sense of direction and an insight into what works and what does not. We will now enter the final phase of development, where we'll settle on our art style and polish and perfect the demo, before unveiling it at VRX in a few weeks' time.