
Glass Animals’ Tour of Earth: AV and technology redefine live music

  • Writer: Live team
  • 4 days ago
  • 10 min read


Late nights in the middle of June


Glass Animals’ Tour of Earth was a wholly immersive visual experience, merging retrofuturistic aesthetics with advanced video techniques


Words Verity Butler


Indie rock band Glass Animals first burst onto the scene back in 2012 with the release of their debut EP Leaflings. The band, made up of childhood friends, was formed by lead singer, writer and producer Dave Bayley, who began creating songs on his computer while suffering from insomnia. The band’s unusual name was chosen by randomly selecting words from a dictionary.

After steady and promising growth through their first two albums, Zaba and How to Be a Human Being, it was the hit song Heat Waves, from their third album Dreamland, that skyrocketed the band’s success – reaching number one on the Billboard Hot 100 in March 2022.


As with many breakthrough artists of recent years, the song’s popularity was catalysed by a lengthy spell of virality on TikTok. The eclectic blend of electronic, psychedelic, indie, anthem and rock peppered throughout the band’s catalogue offers plenty of room for experimentation when it comes to performing songs live.

In 2024, Tour of Earth saw the group hit the road for 40 shows across the UK, Europe and North America. From real-time content synchronisation to Pepper’s ghost holograms and live camera feeds – with a touch of musical improvisation – the goal was to take fans on a surreal journey through space.


“It was an interesting project because they’re still a fairly independent band,” comments co-founder of Cassius Creative, Chris Swain, more commonly known as Squib. “But they do big shows in America, and it surprises people in the UK to hear that they’re able to shift 15,000 tickets a night in the US while still having to be mindful of budget.”



Inspired by seventies and eighties science fiction, the ambitious production – brought to life by Cassius Creative and Fray Studio – marked 80six’s debut collaboration with the Brit-nominated band. “Bringing Glass Animals’ vision to life at The O2 Arena wasn’t just about the tech – it was about creativity, adaptability and ensuring we had the right video gear to get it done,” introduces Samuel Siegel, racks engineer at 80six.


The brief featured complex visual demands, which in turn required a highly adaptable video solution that could scale across venues – from smaller spaces to arenas – without compromising quality.


Planetary pixels

Tour of Earth required a complex integration of creative and technical elements, where video, set and lighting teams all worked as one. It featured a 23-screen LED set-up with a large upstage LED wall (18x9.6m) and smaller screens embedded within the set, including 16 mini-CRT-style monitors.

Real-time synchronisation across 23 video surfaces – the upstage LED wall, hologram, mini screens, 360° ticker tape, computer monitors and ceiling – required precise coordination.


“The goal wasn’t just to capture the show, but to make the Roe LED wall feel like a part of the performance,” Siegel explains. 80six supplied premium LED technology, which included Roe Visual MC7 and CB5 screens paired with Brompton Technology LED processing. LED Creative was also brought in to provide scenic LED elements.


“At the heart of it was a large header piece with over 70 moving heads,” describes Tim Rees, senior project manager at LED Creative. “Each one was in a vacuum-formed pod that had LED around the top to create a halo effect, while the header also had large, slatted front panels with LEDs and a lightbox that ran the perimeter.



“This design was mirrored on the stage as well. In addition to these two main elements were lots of scenic details, like the consoles which were designed to look like the inside of a spaceship. As well as lighting fixtures and screens they had lots of subtle up lights, down lights and light-up buttons and keyboards, allowing movement and interest in all parts of the set.”


A standout feature of the tour’s set design was the Saturn-inspired planet that took centre stage throughout the performance. “The largest request by far was the centrepiece – a gigantic planet, including planetary rings,” adds Rees. “LED Creative provided the construction of the rings, plus a fully wireless pixel LED solution along with them. The piece was all on a single motor so it could drop down into the middle of the stage.”


“The final header piece has to be one of my favourite elements of the show,” Rees emphasises. “Its versatility is incredible, having over 300m of Flex-It, and the lightbox around the edge is so bright it was producing a beam of light. All in all, it’s a super-impressive set piece.”


Intergalactic ecosystem

“Ed Coleman, our video director, had a really clear vision,” Siegel continues. “He wanted to be right in the middle of everything, directing from the front of house (FOH), while being surrounded by the audience. It all had to be seamless – both creatively and logistically.”


To ensure this, Coleman needed visibility on every angle through monitors and multiviews, all while being right out there with the crowd. This ultimately led 80six to select Blackmagic Design – known for a holistic live production portfolio – for the core set-up, enabling the complete synchronicity Coleman was seeking.


“The ATEM 2 M/E Constellation 4K switcher enabled us to create multiview layouts, which was key for Ed’s FOH needs. I set it up so he could have a 4:3 crop of the program feed alongside the regular 16:9 view – this way, he could see exactly what he needed, precisely when he needed it. Running the ATEM 2 M/E Advanced Panel over Ethernet meant the main gear could stay backstage while Ed controlled everything from FOH – even if it was over 150m away.” Choosing Blackmagic across the board wasn’t just about making things work technically; it also made everything much easier operationally.
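The 4:3 crop Siegel describes comes down to simple pixel arithmetic: keep the full height of the 16:9 frame and trim equal amounts from each side. A minimal sketch of that maths (not the ATEM’s internal implementation):

```python
def center_crop_4x3(width: int, height: int) -> tuple[int, int, int, int]:
    """Return (x, y, w, h) of a centred 4:3 crop inside a 16:9 frame.

    Assumes the crop keeps full frame height and trims the sides,
    the usual way a 4:3 view is cut from a 16:9 program feed.
    """
    crop_w = height * 4 // 3   # width that gives a 4:3 ratio at full height
    x = (width - crop_w) // 2  # centre the crop horizontally
    return (x, 0, crop_w, height)

# For a 1080p program feed: a 1440px-wide window, offset 240px from the left
print(center_crop_4x3(1920, 1080))  # → (240, 0, 1440, 1080)
```

For a 1080p feed this gives a 1440x1080 window – the region Coleman would see alongside the regular 16:9 view.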



Siegel stresses that “the fibre system was a big part of that. We had to send video, comms and monitoring signals across the O2, which is a massive venue, and the reliability of their fibre set-up made it possible. We also made sure to build in redundancy by having multiple fibre paths, so if anything went wrong, we had backups ready to keep the show going smoothly.”


As with any tour of this scale, there were several hurdles, both practical and creative, to overcome.

One of these was lighting. “There were no neutral colours. Every song had its own intense lighting set-up, so keeping the visuals consistent and sharp took a lot of finesse. With Blackmagic’s RCP controllers, I adjusted each camera’s hue and saturation individually, ensuring everything stayed coherent between the dramatic lighting changes.


“I also used Blackmagic’s waveform monitor and vectorscope tools to match colours across all cameras, ensuring they stayed balanced despite the constantly shifting lighting conditions.”
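A vectorscope of the kind Siegel mentions plots chroma as an angle (hue) and distance from centre (saturation), which is what makes mismatched cameras easy to spot. As a hedged illustration of the quantities being compared – not Blackmagic’s tooling – the same values can be derived from an RGB sample:

```python
import colorsys

def hue_saturation(r: float, g: float, b: float) -> tuple[float, float]:
    """Hue (degrees) and saturation for an RGB sample in [0, 1].

    A vectorscope shows hue as angle and saturation as radius;
    HSV gives a simple numeric stand-in for comparing cameras.
    """
    h, s, _v = colorsys.rgb_to_hsv(r, g, b)
    return (h * 360.0, s)

# A pure-red sample sits at 0 degrees with full saturation
print(hue_saturation(1.0, 0.0, 0.0))  # → (0.0, 1.0)
```

Matching cameras then amounts to nudging each camera’s hue and saturation until the same on-stage colour lands at the same point for every feed.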


With the lighting and cameras secured, the next stop was choosing the lenses. “We paired the URSA Broadcast cameras with parfocal broadcast lenses like the Canon UJ90, which allowed us to zoom in and focus precisely without losing sharpness.”


Siegel stresses just how critical this selection was for the faster-moving segments of the show: maintaining focus while zooming “made a big difference in those dramatic shots”.


The show demanded a healthy range of camera models to suit specific modes of capture. “We had PTZ cameras fixed on the drummer, bassist and pianist, which Ed controlled from FOH for dynamic, jib-like movements.


“The URSA Broadcast cameras handled dramatic, close-up shots of the band and the audience. We even had a Blackmagic Micro Studio Camera 4K G2 situated right by the stage to get those unique angles that added depth to the visuals. The PTZ cameras were wirelessly controlled, and minimising latency in such a crowded RF environment was crucial to ensure smooth movements and responsiveness for the live show.”



The technical dots were then joined via Blackmagic’s Videohub routers and SDI-HDMI converters. “We used the converters to manage signal integrity, converting SDI to HDMI where needed, especially for the director’s monitors and other outputs. The Videohub also made sure that everyone – from the director to lighting techs – got the specific feeds they needed, with redundancy built into the routing paths.”
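The redundancy Siegel describes – every destination having a backup routing path – can be modelled as a simple fallback lookup. This is a toy sketch of the idea, not the Videohub protocol; the feed and path names are illustrative:

```python
def route(feed_status: dict[str, str], paths: dict[str, list[str]], dest: str) -> str:
    """Pick the first healthy source path for a destination.

    Each destination lists a primary and backup path; we fall back
    to the next path if the primary is down. A toy model of
    redundant routing, not the Videohub API.
    """
    for source in paths[dest]:
        if feed_status.get(source) == "ok":
            return source
    raise RuntimeError(f"no healthy path to {dest}")

feed_status = {"fibre_a": "down", "fibre_b": "ok"}
paths = {"director_monitor": ["fibre_a", "fibre_b"]}
print(route(feed_status, paths, "director_monitor"))  # → fibre_b
```

The point of building this into the routing rather than handling it by hand is that a failed fibre path swaps over without the director’s feed ever going dark.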


80six also supplied a Disguise GX3 media server package for post-processing. This allowed Siegel’s team to weave in real-time effects, giving each song a distinct visual aesthetic that matched the band’s artistic intent.


“The IMAG screens didn’t just show what was happening,” Siegel expands, “they were part of the performance. We also managed to keep latency to a minimum across all the various camera feeds, which was crucial for maintaining synchronisation with the live action occurring on stage.” Powerful graphics cards in the Disguise server helped here, making sure any post-processing effects were applied instantly – and without introducing noticeable lag.


“To maintain audio and video sync, especially for such a large venue, we ensured that the various video feeds transmitted over fibre were perfectly aligned, and we used the ATEM mixer to distribute audio feeds.”
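Aligning feeds with different path latencies usually means padding every feed up to the slowest one. A minimal sketch of that standard approach (the feed names and latency figures are invented for illustration):

```python
def align_delays(latencies_ms: dict[str, float]) -> dict[str, float]:
    """Extra delay (ms) to add to each feed so all land together.

    Every path is padded up to the slowest one - the usual way to
    keep multiple video feeds (and audio) in sync across a venue.
    """
    slowest = max(latencies_ms.values())
    return {name: slowest - ms for name, ms in latencies_ms.items()}

# Hypothetical per-feed latencies for three camera paths
feeds = {"ptz_drums": 48.0, "ursa_foh": 32.0, "micro_stage": 16.0}
print(align_delays(feeds))
# → {'ptz_drums': 0.0, 'ursa_foh': 16.0, 'micro_stage': 32.0}
```

The cost is that the whole system runs at the latency of the slowest path, which is why minimising latency on every feed mattered so much in the first place.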


Solar settings

Ox Event House – known for its market-leading expertise in stage set and prop manufacture – was brought on board to deliver the show’s spacey set.


The company had worked with Glass Animals’ production manager Simon Lutkin previously (creating aluminium staging that incorporated giant pineapples and cacti, no less), and the band’s growth in popularity meant bigger shows and more ambitious prop and set design.


“We were presented with a design from Cassius Creative, who we’d worked for before on an American arena tour,” begins Ben Levitt, technical director at Ox Event House. “The design featured a huge, suspended lighting pod with 72 moving lights in it and formed funnels. There were lots of helical angles to work with, which was especially challenging engineering-wise when it came to getting the weighting and rigging right for every different venue.”



The Ox Event House team immediately set to work, getting in touch with a structural engineering company it had worked with regularly since the late nineties. It also started talking to a rigging company, “to make sure we had enough motors and ensure that the set team were going to be able to clip it together quickly enough every day to achieve the look they wanted,” Levitt adds.


“The floor system was an exact mimic of the system above, so lining them up and getting it perfectly right was really challenging – but equally as exciting.”


Levitt also comments on the hands-on approach required from Lutkin, Cassius and 80six to integrate video into the performance.


“The circular video was one of the main structural elements, which presented the unique task of dressing it with the right depths for the LED Creative team to integrate their system.”


Cosmic creativity

Another of the show’s stars wasn’t the band itself, but the Pepper’s ghost holograms featured in a dome, bringing an extra dimension to the set and enhancing its galactic feel.


The Pepper’s ghost effect was realised with the use of two LED panels on a polycarbonate-formed dome, projecting animated holograms of dolphins and pineapples – tying into the tropical feel of the band’s Pork Soda track.


The show’s climax featured Denzel Curry appearing as a hologram inside the dome, which allowed for a live collaboration – despite his physical absence. The hologram dome itself was inspired by animated sitcom The Jetsons.



“We were the team behind the dome’s physical structure,” Levitt continues, “creating the base, the cover and key structural elements of the Pepper’s ghost, which was set at upstage centre.”

For the visuals themselves, Fray Studio was recruited to deliver the content that was displayed on all 23 of the unique video surfaces featured across the show’s set. These displays integrated real-time and pre-rendered content, with the band’s instruments sending MIDI signals to directly control all the visual elements.


The MIDI-controlled instruments were synths, electronic percussion and keyboards. Using Ableton and QLab, Fray was able to condense the information into simple on and off notes, translating into ones and zeros. Utilising accumulators, condition modifiers and a little bit of maths in Notch, the team created visual representations of the beats, as well as creative accents and nuances corresponding to specific rhythms and notes. The content was as dynamic as the musicians on stage.
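The pipeline described above can be sketched in a few lines: a MIDI note event collapses to a 1 or 0, and an accumulator counts the pulses so a visual can build with the beat. This is an illustrative model, not Fray’s actual Ableton/QLab/Notch setup:

```python
def midi_to_trigger(msg_type: str, velocity: int) -> int:
    """Condense a MIDI message into a 1 (note on) or 0 (note off).

    Note that in MIDI a note-on with velocity 0 also counts as a
    note off, so it maps to 0 here too.
    """
    return 1 if msg_type == "note_on" and velocity > 0 else 0

class Accumulator:
    """Running count of hits, like the Notch accumulators used to
    drive visuals that grow with the rhythm. Names are illustrative."""
    def __init__(self) -> None:
        self.total = 0

    def feed(self, pulse: int) -> int:
        self.total += pulse
        return self.total

# Four genuine hits among six events -> the accumulator climbs to 4
acc = Accumulator()
events = [("note_on", 100), ("note_off", 0), ("note_on", 90),
          ("note_on", 0), ("note_on", 127), ("note_on", 64)]
counts = [acc.feed(midi_to_trigger(t, v)) for t, v in events]
print(counts)  # → [1, 1, 2, 2, 3, 4]
```

In the show, a value like this running total would then be mapped onto a visual parameter – scale, brightness, tunnel depth – so the content responds however the band happens to play that night.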


“So much of it was real-time interactive content that primarily came from the fact that the band don’t play to time code or backing tracks,” says Adam Young, co-creative director of Fray Studio. “This approach means playing live and, consequently, slightly differently each and every night. Any pre-rendered songs were split up sometimes into 25 individual sections that would be manually triggered at the right musical moment by the operator. The rest was created in real time, and was largely controlled directly by the band.”


The initial brief and creative development included many references to sci-fi figures, silhouette tunnels and outlines. During development, the Fray team discovered that Nvidia background removal was the most effective way to achieve this vision for camera content. The technique allowed edge detection to pick up clean edges and create an androgynous-looking figure on screen. From this, the team could generate tunnels using cloners, which led to the chorus look for the song On the Run.



The term retrofuturistic might sound like a contradiction, but the team at Fray fulfilled the creative brief largely thanks to its age-diverse make-up. “I had a team of 11 people working on this, and it was a real mixture of people who were in their early twenties along with some people who were much older, bringing with them decades of experience. It was so nice to work both with people who understood the references from their original sources along with others who would be looking at them from an entirely different perspective. I believe that this led to something that felt truly authentic – but also brand new at the same time.”


Cinematic techniques have taken increasingly central roles in the production of live performances in recent years – from theatre shows to arena tours – and Fray is no stranger to deploying these approaches in its visual content.


“It’s the first show in a very long time where we’ve used such a wide range of tools,” Young explains. “We used Unreal Engine, TouchDesigner and Notch – and we also incorporated lots of Midjourney and AI elements. We took each track and distilled down the idea, then figured out which bit of software would be best for producing that element. It was challenging to figure out how we could make visual effects from many types of software feel like they were still coming from one world.”


Young emphasises how the band’s frontman Dave Bayley was especially open to trying new things, “letting us push further than other artists would usually let us push.”


“Dave was the creative powerhouse behind all of it,” concludes Squib. “They came to us with an almost social art concept, which was to measure the size of humanity within the wider realm of space. He essentially wanted a spaceship to take the audience on a journey.”


This feature was first published in the Mar/Apr 2025 issue of LIVE.

