Finalist 2025

Luke Howard & Simon Burgin | Immersive A/V Performance

Simon Burgin / Luke Howard / Melbourne Recital Centre

Immersive live A/V performance blending real-time digital art, artificial intelligence and music at the Melbourne Recital Centre

Award-winning composer Luke Howard and digital artist Simon Burgin came together to present an ambitious immersive audio-visual experience at the Melbourne Recital Centre - integrating performed real-time digital art with artificial intelligence and live music, transforming the iconic Elisabeth Murdoch Hall.

Design Brief:

This project was a self-initiated collaboration between the artists, presented as a co-production with the Melbourne Recital Centre. It was our third performance at the venue, and we set out to create a fresh immersive experience that could connect deeply with the audience.

Whilst we intentionally expanded on concepts and learnings from previous shows, our outcome would be a new live audio-visual performance that reinterpreted Luke's music through Simon's art and responded to the unique architecture of the Elisabeth Murdoch Hall's rear-wall facade.

For this iteration, we also introduced AI-generated imagery into the production pipeline. It was a bold experiment - one that definitely brought an element of unpredictability and expanded the visual language of the performance. We hoped the result would push us into exciting new territory and re-imagine what a real-time digital performance could be.



Design Process

Given we were experimenting with the introduction of artificial intelligence into an already technical and bespoke procedural art pipeline, the design process for this project was deliberately exploratory and iterative. Our familiarity with the venue helped shape the creative direction. Having previously 3D-mapped Elisabeth Murdoch Hall, we were attuned to the space, particularly its iconic wood-panelled interior designed by ARM Architecture. Rather than treating it as a limitation, we embraced it as both a constraint and a creative canvas for our projection-mapped content.

Despite the experimental nature of the project, the design process was still grounded in a fairly structured method. Initially, Simon developed early visual sketches and procedural systems using TouchDesigner, testing and refining ideas in response to the music. Some visual systems evolved over time; others were scrapped entirely. Each system was built with controllable parameters, enabling Simon to manipulate the designs live via a DMX console. This ensured spontaneity and unpredictability, so no two performances would ever be the same. Shared audio-reactive systems and modular visual components enabled consistent interpretation of musical dynamics across the performance visuals.
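The idea of parametric systems driven live from a DMX console can be sketched roughly as below. This is a minimal illustration only: the channel numbers, parameter names, and class structure are invented for the example and are not from the actual TouchDesigner project.

```python
# Illustrative sketch: binding 8-bit DMX fader values to named parameters
# of a visual system. All names and ranges here are hypothetical.

def dmx_to_param(dmx_value, lo, hi):
    """Map an 8-bit DMX channel value (0-255) onto a parameter range."""
    if not 0 <= dmx_value <= 255:
        raise ValueError("DMX channel values are 8-bit (0-255)")
    return lo + (dmx_value / 255.0) * (hi - lo)

class ArtworkSystem:
    """A visual system exposing named parameters, each bound to a DMX channel."""
    def __init__(self, bindings):
        # bindings: {channel: (param_name, lo, hi)}
        self.bindings = bindings
        self.params = {name: lo for _, (name, lo, hi) in bindings.items()}

    def apply_dmx_frame(self, frame):
        # frame: {channel: 8-bit value} received from the console each tick
        for channel, value in frame.items():
            if channel in self.bindings:
                name, lo, hi = self.bindings[channel]
                self.params[name] = dmx_to_param(value, lo, hi)

# Example: a feedback-loop artwork with two live-controllable parameters
feedback = ArtworkSystem({1: ("feedback_decay", 0.0, 1.0),
                          2: ("bloom_intensity", 0.0, 4.0)})
feedback.apply_dmx_frame({1: 255, 2: 64})
```

Because every parameter resolves from the console state each frame, the same artwork can behave differently night to night, which is what makes no two performances identical.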

A key part of the design process was the integration of a Stable Diffusion model into the real-time pipeline. As an open-source text-to-image AI, Stable Diffusion allowed Simon to generate stylised visuals on the fly. To ensure consistency and avoid unwanted or jarring visual outputs, the AI was carefully constrained using curated prompts and visual guidance, mostly focused on painterly and geometric styles. Because it took guidance from Simon's designs and prompts, the result felt like a live conversation - an interplay between crafted digital design and generative AI, occurring simultaneously. Our approach produced a suite of ten distinct real-time digital artworks, each one designed to enhance and respond to the emotional landscape of Luke Howard's compositions.

Design Excellence

This project exemplifies design excellence through its novel integration of design elements in a live audio-visual immersive performance that is both technically refined and artistically ambitious. In terms of user experience, the visuals didn't just accompany the music; they deepened the connection between performer and audience, translating sound into a responsive visual language that invited emotional engagement.

At the heart of the project was a creative application custom-built in TouchDesigner, integrated with a DMX console that let us perform ten distinct digital art systems live. These designs integrated tightly controlled real-time visuals with AI-generated content - a cohesive visual language that moved in sync with the music. The AI component, powered by Stable Diffusion, was used with care and restraint. Rather than dominating the visuals, it was subtly and intentionally woven in and out throughout the performance. We were careful to build in overrides to guide its aesthetic output, ensuring the AI-generated imagery matched the tone and sensibility of the digital artworks. Each artwork had associated written prompts that could be called up from a table during the performance. This allowed us to generate painterly, geometric, or atmospheric visuals that felt organic and were often indistinguishable from Simon's artworks.
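The "prompt table" idea described above can be sketched as a simple lookup keyed by artwork, with a shared negative prompt acting as one of the overrides that constrains the AI's aesthetic. The prompt strings, parameter names, and values below are invented examples, not the actual prompts used in the show.

```python
# Hypothetical sketch of curated, per-artwork prompts constraining
# Stable Diffusion output to painterly and geometric styles.

PROMPT_TABLE = {
    "point_cloud": "impressionist oil painting, soft architectural light",
    "fluid_sim": "flowing geometric abstraction, muted earth tones",
    "feedback": "recursive painterly texture, warm timber palette",
}

# A shared negative prompt steers the model away from unwanted output
NEGATIVE_PROMPT = "text, watermark, photorealistic faces"

def build_generation_request(artwork, seed, strength=0.55):
    """Assemble a constrained img2img-style request for one artwork cue."""
    if artwork not in PROMPT_TABLE:
        raise KeyError(f"no curated prompt for artwork '{artwork}'")
    return {
        "prompt": PROMPT_TABLE[artwork],
        "negative_prompt": NEGATIVE_PROMPT,
        "strength": strength,  # how far the AI may drift from the source frame
        "seed": seed,          # fixed per cue for visual continuity
    }

req = build_generation_request("fluid_sim", seed=42)
```

Keeping the seed and strength fixed per cue is one plausible way to make successive generated frames cohere with the hand-crafted visuals rather than jump between styles.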

In terms of functionality, each artwork system was designed to be parametric, allowing cues and manipulations to be triggered in real time, directly responding to the musical dynamics of the performance. All artworks were unified within the creative application to provide stable real-time playback, DMX control, and seamless integration with the venue's A/V systems. The result was a robust setup that performed reliably in a high-stakes performance environment. We believe this project sets a strong benchmark not only for what's possible for digital arts in Victoria but also for how emerging technologies like AI can be integrated meaningfully into live performance.

Design Innovation

While this wasn’t our first performance involving live parametric digital art, it was definitely one of our most ambitious in terms of both visual design and system architecture. The performance brought together a full palette of innovative digital art elements, including 3D-captured point clouds, recursive feedback loops, procedural fluid simulations, complex real-time visual compositing and effects, and architectural projection mapping.

Everything was programmed to be parametric, allowing for live control and manipulation. A significant innovation was integrating a real-time AI image generation stream. This was made possible by a custom TouchDesigner component developed by Lyell Hintz, which gave us a front-end interface while running the Stable Diffusion compute process in the background. We enabled acceleration on the model so it could generate images fast enough to produce smooth animation.

A key concern in any live show is the audience's experience and sense of immersion, which would be easily broken if the system lagged or failed at any point. Stability was critical. We needed the entire system to run at a consistent frame rate without missing a beat, which meant careful resource management and optimisation. For example, in one section of the performance where the generative visuals were especially demanding, I temporarily paused the AI image stream to give the system breathing room, then resumed it later in the performance. It was a balancing act, but it worked! The final result maintained a smooth 60 frames per second, with Stable Diffusion running close to 30 frames per second. This delivered a cohesive, fluid experience that never gave away the complexity running under the hood.
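The pause-and-resume balancing act could, in principle, be expressed as a small governor that watches the measured frame time against the 60 fps budget. This is a hedged sketch of the idea only; in the actual show the AI stream was paused manually, and the class name and thresholds below are illustrative.

```python
# Illustrative sketch: pause the AI image stream when the render loop
# nears its 60 fps budget, resume once headroom returns. Thresholds use
# hysteresis so the stream doesn't flicker on and off between frames.

TARGET_FRAME_MS = 1000.0 / 60.0  # ~16.7 ms budget per frame at 60 fps

class AIStreamGovernor:
    def __init__(self, pause_over_ms=15.0, resume_under_ms=12.0):
        self.pause_over_ms = pause_over_ms      # near-budget: pause AI
        self.resume_under_ms = resume_under_ms  # clear headroom: resume AI
        self.ai_enabled = True

    def on_frame(self, frame_ms):
        """Called once per rendered frame with the measured frame time."""
        if self.ai_enabled and frame_ms > self.pause_over_ms:
            self.ai_enabled = False  # give the renderer breathing room
        elif not self.ai_enabled and frame_ms < self.resume_under_ms:
            self.ai_enabled = True   # headroom is back, resume generation
        return self.ai_enabled

gov = AIStreamGovernor()
history = [gov.on_frame(ms) for ms in (10.0, 16.0, 14.0, 11.0)]
```

The gap between the pause and resume thresholds is the key design choice: without it, frame times hovering around the budget would toggle the stream every frame, which is exactly the kind of instability a live show cannot afford.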

Bringing AI into a live performance with real-time responsiveness and visual cohesion is still relatively new. We believe this project pushed the envelope and opened up exciting new territory for us!

Design Impact

The outcome was highly successful. We sold out the Melbourne Recital Centre and received fantastic feedback from the audience. Interestingly, many audience members later said they didn't notice the AI generation at all, though some sensed something else underpinning the visuals and adding complexity. From my perspective, this was a huge mark of success. Like a sleight of hand in a magic trick, the subtle and restrained integration of AI into the overall visual system meant it enhanced the performance without drawing attention to itself and without detracting from the human-crafted elements of the digital artworks.

From a production point of view, the visual system was lightweight and efficient. Everything ran from a single custom-built application with a DMX console, making it scalable, tour-ready, and low impact in terms of environmental footprint. We believe projects like this help position Victoria as a leader in innovative digital performance, showing how emerging tech like AI can be woven into live work in a thoughtful, creative (and hopefully ethical) way. Most importantly, it highlights the value of hands-on experimentation and a pragmatic design ethic. The show opened up fresh possibilities for how we think about live music, digital art, and technology working together. As a cheeky side note, Luke wrapped up the performance with an encore of Kraftwerk's Computer Love - a fitting nod to our AI contributor.
