Crystalline started with the core concept of showing the transition between 2D and 3D while offering a glimpse into the fourth dimension of scale. The aim was to provide an interaction that was fun and physical, using technology to get people to connect with their bodies and their geometric intuition.
My goals with Crystalline were:
To level up my spatial controller design to include feedback mechanisms as well as a more sturdy and finished feel.
To create a controller that was modular and could be easily converted to different control schemes, including VR tracking, tethered, and internally lit controllers.
To integrate spatial audio into the experience and create an audio environment that engaged viewers at a deeper level than the visuals could provide.
To create a projection surface that expressed a sense of scale and would be free standing to allow me to display anywhere.
To level up the production quality of the whole piece.
To explore scale and dimensionality with the projection surface through forced perspective and projection mapping.
To act in more of a creative director role by working with others who had skills I didn’t to create a shared version of the piece.
Working with Brian Smith on the fabrication and Isaac Parker on the hardware, we created a prototype of the controller that met all of the above requirements. The final controller supported VR tracking, haptic feedback, orientation and acceleration tracking, and internal LED lighting.
Even though the controller was leaps and bounds above the previous versions, it still suffered from some planning and design hiccups.
VR tracking took far longer than anticipated and ended up consuming much of the time allotted to the other control systems. It was a risky bet that did not pay off.
The controller was originally designed with an “on/off” button, which, given the complexity of the internals, seemed like a godsend. In practice, many viewers kept pressing the button, triggering a rather lengthy recalibration each time.
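One common fix for this kind of accidental triggering is a hold-to-activate guard: the button only counts as pressed after it has been held for a couple of seconds. The sketch below is illustrative only; the class name, hold duration, and update loop are assumptions, not the actual Crystalline firmware.

```python
import time

class LongPressGuard:
    """Only treat the power button as pressed after it has been held for
    `hold_s` seconds, so accidental taps don't trigger recalibration.
    (Illustrative sketch; not the actual Crystalline controller code.)"""

    def __init__(self, hold_s=2.0):
        self.hold_s = hold_s
        self._pressed_at = None  # timestamp when the current press began

    def update(self, is_down, now=None):
        """Feed the raw button state each frame; returns True exactly once
        when a deliberate long press completes."""
        now = time.monotonic() if now is None else now
        if is_down:
            if self._pressed_at is None:
                self._pressed_at = now
            elif now - self._pressed_at >= self.hold_s:
                self._pressed_at = None
                return True  # deliberate hold: allow power-off/recalibration
        else:
            self._pressed_at = None  # released early: ignore the tap
        return False
```

Quick taps never reach the threshold, so viewers fiddling with the button would no longer restart the calibration routine.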
The Projection Surface:
Originally, the projection surface was in flux; the concept was simply “cube.” Inspired by the likes of Ouchhh and deadmau5, the surface was meant to contain a volume, or show dimensionality, through the use of a cube. Some additional goals were for it to be modular and easily converted into other projection surfaces, and to be transportable: able to be taken apart and relocated by one or two people with just a pickup truck. Most of all, it had to be affordable (a cube of LED screens is arguably not).
This part of the project went extremely smoothly and almost exactly as expected. Next time, I will explore getting the frame manufactured closer to home for budget reasons and working out the frame design with more time to spare before the installation.
The core driver for the interaction was using the controller’s positional information to navigate the viewer through a three-dimensional virtual cave: essentially a spatial puzzle that the viewer solved through a series of controller movements, randomly generated per user. Using audio and visual cues, the viewer would navigate through the space to find pockets of structured patterns that allowed them to scale up into the next dimension.
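The per-user puzzle mechanic described above can be sketched as a randomly generated gesture sequence that the piece matches against recognized controller movements. The gesture names, sequence length, and reset-on-mistake behavior here are assumptions for illustration, not the actual Crystalline implementation.

```python
import random

# Hypothetical gesture vocabulary a controller recognizer might emit.
GESTURES = ["tilt_left", "tilt_right", "push_forward", "pull_back", "lift", "lower"]

def generate_puzzle(length=4, rng=random):
    """Random per-viewer movement sequence, regenerated for each user."""
    return [rng.choice(GESTURES) for _ in range(length)]

class PuzzleTracker:
    """Tracks progress through the sequence; a wrong move resets it."""

    def __init__(self, sequence):
        self.sequence = sequence
        self.progress = 0

    def feed(self, gesture):
        """Feed one recognized gesture; returns True when the full
        sequence has been matched and the viewer 'scales up'."""
        if self.progress < len(self.sequence) and gesture == self.sequence[self.progress]:
            self.progress += 1
        else:
            self.progress = 0  # wrong move: start the pattern over
        return self.progress == len(self.sequence)
```

Audio and visual cues would then be driven off `progress`, growing more structured as the viewer closes in on the solution.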
The visuals were designed to evoke the increasing complexity of the universe. Starting at the atomic scale, the piece would shift through the molecular scale, into the crystal-structure scale, and into real-world scale by the end, allowing the viewer to navigate upward through scale and around in space. Squares and cubes were chosen as the visual base because the “orthographic similarity” of the shapes allowed for a consistent-feeling design throughout all the visual spaces and an easy perspective shift into the next dimension. The projection mapping was designed so that the visuals went from a forced-perspective 2D plane into a volumetric 3D cube; this was achieved with a single projector using TouchDesigner’s CamSchnappr component.
The audio was designed in layers that were added to signal when the shifts upward in dimension would happen. It was also designed to be played through a spatial audio system running Envelop, using directional cues to inform the viewer which way to head to find other elements of the piece.
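Directional cues like these can be sketched as equal-power stereo panning toward a cue's bearing. This is a deliberate simplification of what a full spatial system such as Envelop provides (and of the stereo fallback mentioned below); the function name and bearing convention are assumptions.

```python
import math

def stereo_pan_gains(bearing_deg):
    """Equal-power stereo gains for a cue at `bearing_deg` degrees,
    where -90 is hard left, 0 is center, and +90 is hard right.
    A rough stand-in for a real spatial audio system like Envelop."""
    # Clamp the bearing, map it to a pan position in [0, 1],
    # then apply the equal-power (constant-energy) pan law.
    clamped = max(-90.0, min(90.0, bearing_deg))
    pan = (clamped + 90.0) / 180.0
    left = math.cos(pan * math.pi / 2)
    right = math.sin(pan * math.pi / 2)
    return left, right
```

Scaling each audio layer by these gains nudges the viewer toward the cue's direction without any dedicated spatial hardware.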
The prototype was a success; it glued all the pieces together into something cohesive, large-scale, and enjoyable.
The visuals, the game mechanics, and the audio integration still need a lot of attention before the piece is complete. The visuals changed drastically once I got the frame and started projection mapping; I had to rework how to convey the shifts in depth once I began projecting onto the frame. Having the frame, or even a small prototype of it, earlier would have helped the visuals at a more pivotal point in their development.
Audio setup is no joke. My audio setup plans fell through pretty late, and I did not have the knowledge or familiarity with audio equipment needed to adapt at the last minute; this led to falling back to a stereo sound setup.
It is never too early to work on the game mechanics. I prototyped them pretty roughly early in the development process and, once I hit proof of concept, switched focus. Hitting snags in the SteamVR hardware integration later on led to a drastic simplification of the interaction. This could have been avoided had I spent more time on the game mechanics, with VR tracking as a stretch goal rather than the opposite.
Overall, Crystalline is in good shape and I’m excited to continue to polish and push the experience further.