Avenged Sevenfold UE5 Video ‘Surreal’, 3D Artist Says

Avenged Sevenfold artist Ryan McKinnon explains his workflows for creating music videos on Unreal Engine


Published: May 31, 2023

Demond Cureton

Immersive and 360-degree filmmaking have gained popularity as musicians and artists turn to extended reality (XR) to engage with audiences. Avenged Sevenfold, one of the most prominent heavy metal bands of its era, has begun its Web3 journey with a new metaverse platform and a music video featuring 360-degree immersive content.

XR Today interviewed Ryan McKinnon, Freelance 3D Artist and creator of the immersive video for the new Avenged Sevenfold single, “We Love You.” The new hit track is featured on the upcoming album, “Life Is but a Dream,” which is the band’s first release after a roughly seven-year hiatus.

He discussed his experiences developing workflows in Epic Games' Unreal Engine, the future of immersive filmmaking, and his creative process for immersive media.

Kindly view the music video with a smartphone or virtual reality (VR) headset.

XR Today: How does it feel to create this video for Avenged Sevenfold? What kind of vision did you want to communicate with audiences?

Ryan McKinnon: I used to have the band’s t-shirts on my wall, and it’s surreal to create something like this, especially for a band like them.

It started with hand drawings to get my initial ideas out, beginning from the middle of the video, weirdly. It began with the idea of the shifting levels of society, [emanating] from a house. [The video] originally didn’t have the field or nature, but I worked backwards and created the idea by starting from the middle.

XR Today: How easy was it for you to use Unreal Engine, and how did it facilitate the creative process for developing the video?

Ryan McKinnon: It’s worlds easier than it would have been just a few months ago. It’s pretty incredible. You have [around] 20,000 assets at your disposal, just right off the bat, when working in Unreal. They’re all great photo-scanned assets.

They also have a marketplace full of great assets the community makes, and they host free items every month. If you have an account, it's always good to hop in there and grab them.

It’s the craziest programme I’ve ever used. I used to play a lot of video games, and now I play around in the place where video games are made.

XR Today: We’ve seen artists like Fatboy Slim, VNCCII, Snoop Dogg, Eminem, and others connect to their audiences using gaming engines like Unreal and Unity. How does connecting in immersive spaces compare to traditional methods of engaging with audiences?

Ryan McKinnon, Freelance 3D artist and creator

Ryan McKinnon: [Initially] I was exploring the 3D and 3D 360 visions for a personal project I was working on. That’s how these projects always go — I explore something for a personal reason, get excited about it, share it, and then work on my next project somehow.

I think it's cool that the audience can choose where they're looking throughout the video, rather than my dictating exactly what the viewer sees. That was also a little unnerving, knowing that everyone can look at any point for as long as they want, rather than seeing only what I'd typically show between cuts in a music video.

Traditional editing is so quick that, if you want to hide an error or something, you can cover it up with a quick cut. However, [3D videos are] definitely different, and I have to consider that viewers will be looking at everything the entire time, so we had better make it interesting.

XR Today: How did you react to the finished product once you viewed it? How did Avenged Sevenfold communicate with you through the creative process?

Ryan McKinnon: It was big, and the first time I put on the headset, I had a huge smile. That was a very good sign, and I feel like having the video pre-recorded this way allows the headset to have higher fidelity. That was a weird, new, and fun way to make art.

I pitched a similar idea to the band, but they offered a chance to do this video instead. I did work partly from the other idea's 360-degree visuals, but later took the concept and fleshed it out on paper. Then, I texted [the band] with just a few sheets of paper and stick figures [to get feedback].

About two or three days later, when I had a visual pitch, I designed it in Blender and used a website for building skyboxes with artificial intelligence, full of 360-degree images.

Blockade Labs is a Stable Diffusion-linked programme, and it essentially built my base. The viewer would stand in a Blender scene with multiple AI-generated environments changing around them. I couldn't have built the environments quickly enough to pitch the idea otherwise.
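AI skybox tools like Blockade Labs typically output equirectangular panoramas, the standard format for 360-degree images and skyboxes. As a minimal sketch of how that format works (the function name and coordinate convention here are illustrative, not part of McKinnon's pipeline), each pixel of the flat image maps to a direction on a sphere surrounding the viewer:

```python
import math

def equirect_to_direction(u, v):
    """Map normalized equirectangular coordinates (u, v in [0, 1])
    to a unit 3D view direction, with z pointing at the image centre
    and y pointing up. The horizontal axis spans 360 degrees of
    longitude; the vertical axis spans 180 degrees of latitude."""
    lon = (u - 0.5) * 2.0 * math.pi   # longitude: -pi .. pi
    lat = (0.5 - v) * math.pi         # latitude: -pi/2 (bottom) .. pi/2 (top)
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)
```

The centre of the image maps to straight ahead, and the top row collapses to the zenith, which is why equirectangular panoramas look stretched near the poles.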

I could then speed up the workflow by representing ideas through those images, and afterwards, we started working right away. We rendered the video once a week to get a feel for where it was and exchange notes.

We constantly checked on the video because it runs for six minutes, so there’s much to check. A few people also helped me test the visuals.

XR Today: How easy was it to port the immersive video to Avenged Sevenfold’s live performance at Area-15?

Ryan McKinnon: It was a process [and] I was initially a little lost. You have to map each wall as if it were an individual video, and I'd never done that before. I set up a pretty cool rig in Unreal Engine, where I had the pixel map, similar to a camera setup, set to the exact size of each wall.

Images from Avenged Sevenfold's concert at the Area 15 venue in Las Vegas, Nevada. The band performed songs from its upcoming album, 'Life Is but a Dream'. PHOTO: Avenged Sevenfold/Melissa Libertelli

That way, I could move my camera and render it up to [superimpose the content] on that wall. At first, it was a little confusing, but once I got a good pipeline down, it was very cool.

XR Today: What do you think about the future of immersive performances? Will they become a new medium that artists can use to express their music?

Ryan McKinnon: I think there’s a lot of potential, not just with animation, but also with creating greenscreen content with actual people in the video. I’ve been doing a bit of that over the last few months. It’s making greenscreen [content] blend with the real and Unreal environment.

It's a challenge, but I think there will be a huge, flourishing community of people creating in that pipeline as it matures.

There are a few apps you can use to decently key out greenscreen or even non-greenscreen content, and then toss that into Unreal Engine, where that person is now hanging out in whatever scene they've built.
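Keying apps vary, but the core idea of a chroma key is simple: make a pixel transparent when green clearly dominates its red and blue channels. This toy per-pixel version (the threshold value is an arbitrary illustration; real keyers work in other colour spaces and soften the edge) shows the principle:

```python
def chroma_key_alpha(r, g, b, threshold=0.4):
    """Crude green-screen key for one pixel with RGB in [0, 1]:
    return alpha 0.0 (fully transparent) when green exceeds the
    larger of red and blue by more than `threshold`, else 1.0."""
    green_dominance = g - max(r, b)
    return 0.0 if green_dominance > threshold else 1.0
```

A bright greenscreen pixel such as (0.1, 0.9, 0.1) keys out, while ordinary skin or clothing tones, where green never dominates that strongly, stay opaque.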

I've seen a few people use motion capture suits to move between their real-life bodies and a mocapped character. I think it's super exciting, and getting a 3D programme under your belt [would be beneficial to anyone] interested in the filmmaking world.

I come from a purely camera-based world, and both a real camera and Unreal Engine change the lighting depending on whether you adjust the shutter speed or aperture. You can learn how to use a camera in Unreal, a free programme, which is the most mind-blowing thing.
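The exposure behaviour McKinnon describes follows the same photographic relationship in both worlds: the standard exposure value at ISO 100 combines aperture (f-number N) and shutter time t as EV = log2(N² / t), so a faster shutter or smaller aperture darkens the image by whole stops. A quick sketch of that formula (note Unreal's cine camera exposes its own settings; this is just the underlying arithmetic):

```python
import math

def exposure_value(aperture_f, shutter_seconds):
    """Exposure value at ISO 100: EV100 = log2(N^2 / t),
    where N is the f-number and t the shutter time in seconds.
    Each +1 EV halves the light reaching the sensor."""
    return math.log2(aperture_f ** 2 / shutter_seconds)
```

For example, halving the shutter time at a fixed aperture raises the EV by exactly one stop, which is why changing shutter speed visibly changes the lighting in both a camera and Unreal's physically based exposure.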

For more information, kindly visit Ryan McKinnon's and Avenged Sevenfold's websites.
