A collaboration for Ali's new solo show at Plan X Gallery Milan: a discussion about how to incorporate a screen into one of his fantastic model builds led to us co-developing this piece.
For the technical build, I needed a fire simulation that would run well on a Raspberry Pi, so given the performance constraints I built the pipeline around video playback. The fire and smoke simulations were built in EmberGen and exported as an image sequence, which was then comped in After Effects and rendered as an MP4.
My Python script then blits all of the video frames into the framebuffer, which minimises any hitching when swapping videos or looping. A time-of-flight laser sensor detects when anyone approaches the sculpture and switches the playback from smoke to flames. For SFX, to minimise memory usage, a single looping effect plays the whole time: its volume rises when someone approaches, and a one-shot effect plays on the transition.
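The playback loop can be sketched roughly as below. This is a minimal illustration, not the actual script: it assumes frames have already been pre-decoded into raw pixel buffers matching the framebuffer's format (the decode step itself is not shown), and the `blit_loop` name and its parameters are hypothetical.

```python
# Hypothetical sketch: pre-decoded raw frames are written straight to the
# framebuffer device, so looping or swapping clips is just a sequence of
# contiguous writes with no mid-playback decoding to cause hitches.

import time

def blit_loop(frames, fb, fps=25, repeats=1):
    """Write each raw frame buffer to the framebuffer at a fixed rate."""
    delay = 1.0 / fps
    for _ in range(repeats):
        for frame in frames:
            fb.seek(0)       # overwrite from the top-left pixel each time
            fb.write(frame)  # one contiguous write per frame
            fb.flush()
            time.sleep(delay)

# On the Pi this would target the real framebuffer device, e.g.:
# with open("/dev/fb0", "r+b") as fb:
#     blit_loop(smoke_frames, fb)
```

Writing directly to `/dev/fb0` like this sidesteps a compositor entirely, which is one reason the approach suits a low-powered board.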

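The sensor-driven behaviour amounts to a small state machine. Here is a hedged sketch of that logic, assuming the sensor reports distance in millimetres (as VL53L0X-style time-of-flight parts do); the class name, threshold value, and volume levels are all hypothetical, and the actual clip-swapping and audio playback are left to the caller.

```python
# Hypothetical state machine for the proximity behaviour: a reading below
# the threshold selects the flame clip, raises the looping SFX volume, and
# flags a one-shot SFX on the moment of approach.

FLAME_THRESHOLD_MM = 800  # assumed trigger distance; tune per installation

class FireState:
    def __init__(self, threshold_mm=FLAME_THRESHOLD_MM):
        self.threshold_mm = threshold_mm
        self.flames = False  # start on the idle smoke loop

    def update(self, distance_mm):
        """Return (clip, loop_volume, play_one_shot) for one sensor reading."""
        near = distance_mm < self.threshold_mm
        one_shot = near and not self.flames  # edge: someone just approached
        self.flames = near
        clip = "flames" if near else "smoke"
        volume = 1.0 if near else 0.3  # duck the loop while idle
        return clip, volume, one_shot
```

Keeping the one looping effect always playing and only changing its volume avoids loading and unloading audio buffers at runtime, which matches the memory-saving approach described above.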