[Monster or Friend] The making of a WebVR experience

For those who are curious, Kif and I thought we would share some behind-the-scenes details of our first WebVR demo, Monster or Friend.

This project was created with A-Frame and three.js in about one week. To be honest, I am generally not a framework person and try to stay as low-level as possible while still staying humane (no raw OpenGL, lol) - but using A-Frame was probably the right choice for this project.

A-Frame gives you a working controller in just one line, including the controller model, positioning, and highlighting of user interactions on the model itself. I was also able to open this demo in Chrome, on an Oculus Go, and on my Pixel 2 phone, and the views and controls just worked, which was really cool and something I wouldn't want to program from scratch. I also appreciate that A-Frame uses an entity-component-system architecture, which may take some getting used to if you are used to coding three.js examples in one HTML file, but I think it is worth it for organizational purposes.
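For illustration, that one-line controller can look something like the markup below. This is a hedged sketch using A-Frame's stock laser-controls component, not the exact entity from our scene; the hand value and the rest of the scene are placeholders.

```html
<a-scene>
  <!-- One line: controller model, tracking, and interaction highlights included -->
  <a-entity laser-controls="hand: right"></a-entity>
</a-scene>
```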


Game screenshot with purple theme

The creature was the starting point of this demo. I have been interested in procedural animation for the last few months, and contributed a little to this repo, which implements inverse kinematics for three.js. I wanted to play around more with the library, so I thought I'd make a 25-legged sea / space creature : )

Inverse kinematics is really awesome: it adds a complexity that would be very hard to animate manually. Every limb of this creature is one bone chain with a target point (the head). Inverse kinematics inverts the forward kinematics equation to determine the bone positions and rotations such that the last bone reaches the target point at every step.
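The demo uses the three.js IK library mentioned above; purely as an illustration of the idea, here is a minimal generic FABRIK solver for a single chain (function names like `solveIK` and `stepToward` are my own, not the library's API):

```javascript
// Minimal FABRIK inverse-kinematics solver for one bone chain.
// points: array of [x, y, z] joint positions; lengths: bone lengths.
function dist(a, b) {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

// Place a point at distance `len` from `from`, in the direction of `to`.
function stepToward(from, to, len) {
  const r = len / dist(from, to);
  return from.map((v, i) => v + (to[i] - v) * r);
}

function solveIK(points, lengths, target, iterations = 10, tolerance = 1e-3) {
  const base = points[0].slice();
  for (let it = 0; it < iterations; it++) {
    // Backward pass: pin the end effector to the target, walk to the root.
    points[points.length - 1] = target.slice();
    for (let i = points.length - 2; i >= 0; i--) {
      points[i] = stepToward(points[i + 1], points[i], lengths[i]);
    }
    // Forward pass: pin the root back to its base, walk to the end.
    points[0] = base.slice();
    for (let i = 1; i < points.length; i++) {
      points[i] = stepToward(points[i - 1], points[i], lengths[i - 1]);
    }
    if (dist(points[points.length - 1], target) < tolerance) break;
  }
  return points;
}
```

Each limb of the creature corresponds to one such chain, with the moving head as the shared target.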

Inverse kinematics example from three.ik

After playing around with randomized target-point movement, a tweening library called tween.js, and the forever-handy sine function, I found a pretty organic effect that brought the creature to life.
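As a rough sketch of the sine part of that recipe (the amplitudes, frequencies, and phases below are made-up illustration values, and the tween.js easing is omitted): layering a few phase-shifted sines per axis gives the IK target a drifting, non-repeating-looking path.

```javascript
// Drive the IK target with phase-shifted sines per axis for organic drift.
// All constants here are illustrative, not the demo's actual tuning.
function targetPosition(t, base = [0, 1.5, 0]) {
  return [
    base[0] + 0.4 * Math.sin(0.7 * t),
    base[1] + 0.25 * Math.sin(1.3 * t + 2.0),
    base[2] + 0.4 * Math.sin(0.5 * t + 4.0),
  ];
}
```

Called once per frame with the elapsed time, this keeps the target (and so every limb) gently in motion.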

Making a shader for the creature was the next step: I blended some colors along the UVs, changed the colors over time, and used an opacity map to add complexity.
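The blend itself is the same idea as GLSL's `mix()`. Here is a CPU-side sketch of it (the pulse shape and parameter names are assumptions for illustration, not the demo's fragment shader):

```javascript
// Linear blend between two RGB colors, like GLSL's mix(a, b, t).
function mix(a, b, t) {
  return a.map((v, i) => v * (1 - t) + b[i] * t);
}

// Blend along the v coordinate of the UVs and pulse the blend over time.
function creatureColor(uvV, time, colorA, colorB) {
  const pulse = 0.5 + 0.5 * Math.sin(time); // oscillates in [0, 1]
  return mix(colorA, colorB, uvV * pulse);
}
```

In the real shader this runs per fragment, with the opacity map sampled on top of the blended color.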

My creature was ready for the world.


Colors are really hard, and also really important for the purely visual demo that we were building. At this point, we had a nice environment with a spacey sky sphere, little dust particles, and the creature. We decided to consolidate these elements with color themes of 7 colors: 2 colors for the sky, 1 color for the particles, and 4 colors for the creature (it transitions between two states).

We organized these colors into one color component that is added to every element of our scene, and then got carried away for a few hours playing around with the endless color possibilities. We used this color palette generator for initial inspiration, sampled some colors from images we liked, and tried the colors from our website logo :)
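The data side of such a component can be sketched like this. The hex values below are placeholders I made up, not the demo's palettes; only the 2 / 1 / 4 split matches the description above.

```javascript
// Seven-color themes: 2 sky colors, 1 particle color, 4 creature colors
// (the creature transitions between two 2-color states).
// All hex values are placeholders, not the demo's actual palettes.
const themes = [
  { sky: ["#1b1035", "#4527a0"], particles: "#e1bee7",
    creature: ["#7e57c2", "#26c6da", "#ec407a", "#ffca28"] },
  { sky: ["#041f1e", "#00695c"], particles: "#b2dfdb",
    creature: ["#26a69a", "#ffb300", "#8d6e63", "#c6ff00"] },
];

// Advance to the next theme, wrapping around (e.g. on a trigger press).
function nextThemeIndex(current) {
  return (current + 1) % themes.length;
}
```

Keeping all seven colors in one structure means swapping a theme is a single index change that every element of the scene picks up.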


I mentioned before that the A-Frame controller was awesome because a lot of the functionality was built in… but not all of it. Since movement based on controller input is not easily defined in VR, the A-Frame devs decided not to include it as part of the library. However, it was not too difficult to define our own movement scheme, where the user moves forwards and backwards with the trackpad and changes the color theme with the trigger.
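The core of such a scheme is just mapping a trackpad axis onto the headset's yaw direction. Here is a hedged sketch of that math; the function name, axis sign convention, and parameters are illustrative, not our actual component code.

```javascript
// Translate trackpad input into movement along the headset's yaw direction.
// axisY in [-1, 1]: push forward to move forward, pull back to reverse.
// Names and conventions here are assumptions for illustration.
function moveStep(position, yawRadians, axisY, speed, dt) {
  // In three.js / A-Frame, "forward" at yaw 0 is the -z axis.
  const dx = -Math.sin(yawRadians) * axisY * speed * dt;
  const dz = -Math.cos(yawRadians) * axisY * speed * dt;
  return [position[0] + dx, position[1], position[2] + dz];
}
```

Keeping y untouched means the user glides along the floor plane rather than flying toward wherever they are looking, which felt more comfortable in VR.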

Screenshot of game controller tutorial
Controller helper in VR

Another thing we thought was important to add was a tutorial for the controller, to show users how to move around the scene. We made this 3D text in HoudiniFX, and added a subtle fresnel shader to the controller. The tutorial hovers in front of the camera until the user's first interaction.


We felt like the creature might need some entertainment in its new home, so we decided to add some music to its world. I was dreaming of something ambient and deep, and decided to create it in FoxDot, an awesome framework for making music with Python. I used simplex noise to drive the ambient sounds, and recorded it with Soundflower and QuickTime. Below is a little clip from the process.


The experience

Source code

Other projects