I was very fortunate to have had the chance to work on Lego’s Hidden Side franchise as part of their core design team. I was brought on when the project was in its infancy to provide visual and technical support to their designers. This involved prototyping concepts to take into focus testing, establishing visual themes, and designing and implementing various visual mechanisms to support design concepts.
This project had a range of challenges, but top of the list was that everything had to function on mobile hardware within the technical constraints of AR. At the time, this equated to the use of Vuforia, as ARKit and ARCore had not been officially released. Thankfully, the minimum-spec device was considered an iPhone 6, and the latest version of Vuforia supported OpenGL ES 3.0 and 3.1, so we had a little more graphical horsepower to play with. Some of the concepts we explored didn’t make it into the final proof of concept due to these technical limitations. We used Unity for all the internal prototyping, and I worked closely with an on-site programmer at Lego who handled all the game logic and AR side of things.
One of the main concepts I had to explore was the ghost-catching mechanic. The initial idea from the design team was to have some form of “light-tether” that the player could aim at the various ghosts in the game in order to capture them. The first video below shows my implementation of this.
Another feature of the game that I had to establish a visual treatment for was that of ghost and point-of-interest discovery. I came up with a number of custom shaders, as well as a post-process effect, that would reveal things hiding within the environment.
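The actual reveal shaders were Unity shader code, but the core of a discovery effect like this can be imagined as a radial mask: anything inside a growing radius around the discovery point fades in, with a soft edge. The following Python sketch is an assumed simplification just to show the shape of the maths; the function names are illustrative, not the shipped API.

```python
# Illustrative sketch of a radial reveal mask (an assumption about the
# general technique, not the production shader code).

def smoothstep(edge0, edge1, x):
    """Standard GLSL-style smoothstep: 0 below edge0, 1 above edge1."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def reveal_alpha(dist, radius, softness):
    """1.0 fully revealed near the discovery point, fading to 0.0 at the
    reveal radius; `softness` controls the width of the fade band."""
    return 1.0 - smoothstep(radius - softness, radius, dist)
```

Animating `radius` outward from the discovery point over time gives the expanding-reveal look; per-fragment `dist` would come from world-space distance in the shader.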
Catching ghosts required the player to deposit them in some form of containment area, which needed its own visual treatment. The video below shows what I came up with: a combination of shader and particle effects.
The environment points of interest needed signposting to the player, so I implemented a solution that would “stick” the player reticule to them when within a given world-space distance and screen-space radius.
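The reticule-sticking logic above boils down to a two-stage filter: reject candidates outside the world-space range, then snap to the nearest candidate inside the screen-space radius. The production version was a Unity C# script; this is a language-agnostic Python sketch where `project` (a world-to-screen projection callback) and the function name are my own illustrative assumptions.

```python
import math

def snap_reticule(reticule_px, camera_pos, points, max_world_dist,
                  max_screen_px, project):
    """Return the screen position the reticule should stick to, or the
    original reticule position if no point of interest qualifies."""
    best, best_px_dist = None, max_screen_px
    for world_pos in points:
        # Reject points outside the allowed world-space range first.
        if math.dist(world_pos, camera_pos) > max_world_dist:
            continue
        screen_pos = project(world_pos)  # world -> screen projection
        px_dist = math.dist(screen_pos, reticule_px)
        # Keep the closest candidate inside the screen-space radius.
        if px_dist < best_px_dist:
            best, best_px_dist = screen_pos, px_dist
    return best if best is not None else reticule_px
```

In Unity the projection step would be something like the camera's world-to-screen transform, and the snap target would typically be smoothed over a few frames rather than applied instantly.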
As ghosts wandered around the environments, they would leave a trail of infection called “gloom”. This would increase in intensity over time, and it was the player’s job to clean it up. I built a Houdini tool that would generate a single geometric shell of the base model from a VDB representation. This shell was then poly-reduced and broken into smaller chunks using a voronoi operator. I developed a simple C# script in Unity that would activate these shells based on a given ghost’s proximity, and a custom shader gave it the final appearance. I used an optimised dithered blend for the shader to keep it as efficient as possible on mobile hardware, and it performed very well.
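A dithered blend of this kind usually means ordered dithering: instead of true alpha blending (which costs fill-rate on mobile), each fragment is kept or discarded by comparing its fade value against a screen-aligned Bayer threshold matrix. The shipped effect was a shader; this Python sketch just demonstrates the per-pixel maths, and assumes a 4x4 Bayer pattern.

```python
# Ordered-dither "blend": a fragment survives only if its fade value
# exceeds the Bayer threshold for its screen pixel. A minimal sketch of
# the general technique, not the production shader.

# Standard 4x4 Bayer matrix; normalised to thresholds in [0, 1) below.
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_keep(x, y, fade):
    """True if the fragment at screen pixel (x, y) survives at `fade`
    (0.0 = fully transparent, 1.0 = fully opaque)."""
    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0
    return fade > threshold
```

At `fade = 0.5` exactly half the pixels in each 4x4 tile survive, which reads as 50% transparency at a distance; in a shader the discard path also keeps the geometry compatible with opaque-pass depth writes.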
Another simple tool I built in Houdini was a cloud model generator. The environments used a mixture of accurate Lego models as well as abstract, non-Lego-like models. The clouds had to look more organic than something built from Lego components, but not fluffy or cartoon-like.
A brief R&D period was spent on what the background AR camera feed might look like. Since the player was supposed to be looking into the Lego world through some form of technology, we thought it would be nice to stylistically treat the video feed in some manner to help reinforce this. I played around with running a Sobel edge filter on the incoming video via a custom shader, and it looked pretty interesting. Since this was captured on desktop, I piped my webcam feed in using Unity’s desktop AR simulation, which you can see in the following video.
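The Sobel filter itself is a pair of fixed 3x3 convolution kernels, one per axis, whose combined gradient magnitude is high wherever brightness changes sharply. The production version ran per-fragment in a shader; this minimal sketch operates on a grayscale image as a plain Python list-of-lists purely to show the kernels and the maths.

```python
# Minimal Sobel edge filter sketch (an illustration of the technique,
# not the production shader code).

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient

def sobel_magnitude(img):
    """Gradient magnitude per interior pixel; borders are left at zero."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(SOBEL_X[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SOBEL_Y[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

In a shader the same nine taps are done with texture fetches around the current UV, and the magnitude is then mapped to a line colour over the (darkened or desaturated) camera feed.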
Later on in the project cycle I got to work on some gameplay elements. Small Lego modules that the player would place in the physical world would be recognised by Vuforia’s object recognition and spawn corresponding game modules in AR. These modules would aid the player in capturing and defeating ghosts and other enemies in the game. I built a flexible component system that could be used for any of these modules, giving designers access to various states and actions that could be performed, with corresponding visuals. I also implemented three example modules requested by the design team: a bomb-throwing damage module, an ice-throwing module that slows movement, and an electrical module that stuns and applies damage over time. The following video was captured as one of my daily check-ins, which I think is a good example of how I typically communicate with clients, and it also features yours truly!
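A designer-facing component system like this can be reduced to named states plus actions that fire when a state is entered. The real thing was a Unity C# component wired to visuals; this Python sketch is a loose assumption about its shape, with all names (the class, `on_enter`, `set_state`) being illustrative rather than the shipped API.

```python
# Loose sketch of a state/action module component (names and structure
# are assumptions, not the shipped Unity C# API).

class ModuleComponent:
    def __init__(self, states):
        self.states = list(states)
        self.state = self.states[0]      # first state is the default
        self._handlers = {}              # state -> list of callbacks

    def on_enter(self, state, callback):
        """Register an action (e.g. a visual effect) to run on entering
        `state`."""
        self._handlers.setdefault(state, []).append(callback)

    def set_state(self, state):
        """Switch state and fire every action registered for it."""
        if state not in self.states:
            raise ValueError(f"unknown state: {state}")
        self.state = state
        for callback in self._handlers.get(state, []):
            callback()
```

A hypothetical bomb module would then be configured as `ModuleComponent(["idle", "armed", "fired"])`, with designers attaching throw animations and damage triggers to the `"fired"` state.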