A hypnotic plunge into the meditative state triggered by the music of Brussels-based trio Piloot, Gyre sees Piloot flute player and 3D artist Florian Guibert marry his musical and visual skills for an ephemeral CGI journey. Celebrating the release of the band’s new five-track studio album of the same name and making its premiere here on Directors Notes today, we invited Florian to reveal how his intimate knowledge of the Krautrock band’s music acted as a creative guide for the visual experimentation of Gyre.
– A heads-up: we suggest watching Gyre in 4K for the full visual experience!
The starting point of the idea for GYRE was to make a music video for the upcoming album of my band Piloot, so I was both the musician and the 3D artist. Being part of the creative process of the music itself, I have a close relationship with it. The interaction between music and image has always been central to my creative process. I see images when I hear music, and using that music as a guideline is something I’ve been experimenting with since the beginning of my graphic practice.
The interaction between music and image has always been central to my creative process. I see images when I hear music.
I’ve been learning the procedural 3D package Houdini from SideFX for the past two years. I’ve been a freelance 3D artist for more than 15 years, and this software has really allowed me to bring to life ideas I’ve had for a long time. GYRE was the result of one of those ideas. The Houdini online community is very active, which enabled me to find the tools I needed. I’ve always been intrigued by patterns in nature, like the ones created by whitewater on the ocean, and luckily Houdini is the right tool to handle such things.
The music was a means to frame and guide my experiments. As I didn’t have much time on my hands – the whole piece was made in six weeks – and the track was eight minutes long, I had to find a way for the machine to take on the bulk of the animation work.
Under the hood, there’s a particle system driven by a physics simulation in which I set up my own forces and collision objects. I could animate those two elements in time with the music and see how the simulation would react to it, then adjust that animation until I got a result I was happy with.
That stage was the core of the creative process, as I couldn’t predict what the result of the simulation would be. These accidents were an excellent source of interesting patterns I hadn’t foreseen. I ended up with an eight-minute-long simulation interacting with the music.
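The principle of animating simulation forces along to the music can be sketched outside Houdini too. Below is a minimal, hypothetical numpy version in which a per-frame loudness envelope scales a swirling force acting on a cloud of particles – the names, numbers and the synthetic envelope are all illustrative, not Florian’s actual setup:

```python
import numpy as np

def simulate(envelope, n_particles=1000, dt=1 / 24, seed=0):
    """Advect 2D particles with a swirl force whose strength follows
    a per-frame loudness envelope (a stand-in for the music)."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1, 1, (n_particles, 2))
    vel = np.zeros_like(pos)
    frames = [pos.copy()]
    for amp in envelope:
        # Swirl force: perpendicular to each particle's position,
        # scaled by how loud the music is on this frame.
        swirl = np.stack([-pos[:, 1], pos[:, 0]], axis=1)
        vel += amp * swirl * dt
        vel *= 0.98          # drag keeps the system stable
        pos = pos + vel * dt
        frames.append(pos.copy())
    return np.array(frames)

# A fake two-second envelope standing in for the track's loudness.
env = np.abs(np.sin(np.linspace(0, np.pi * 4, 48)))
frames = simulate(env)
```

As in the film, the interesting part is that the output isn’t fully predictable from the inputs: you tweak the forces, rerun, and keep the happy accidents.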
I used Redshift’s ability to handle geometry and points directly on the GPU, so I didn’t have to generate a mesh from the wireframe before rendering it. I worked on the final look before getting into animating to make sure I was happy with it, so I knew roughly how it would look at the end. Thanks to Redshift’s speed, I didn’t have to downscale my previews and so worked in full HD throughout the process.
The music was a means to frame and guide my experiments.
The next stage was closer to a normal film director filming a subject. I placed cameras and shot rushes, ending up with different angles of the same animation, and edited the shots afterwards. This is a different approach from what I would normally use in CG production, where I’d frame the shots as close as possible to the final edit, as render time is always an issue. I used the GPU-based render engine Redshift, which allowed me to render frames very efficiently on a quad-GPU graphics workstation.
At the compression stage, I had technical problems preserving the detail of fine lines with closely matching shades next to each other. The only way I found around that was to upscale the full HD footage to 4K and upload the piece at that resolution.
I’m constantly searching for new ways of creating unconventional images. My next work might be made for print – I’m still thinking about it… we’ll see!