This week, we review a supporter-recommended iClone fantasy machinima that surprised us with its polish: “Quest of the Key – Chapter One” by AuroraTrek. We’re always saying we want more story-driven iClone machinima (and fewer tech-demo vibes)… and this one delivers on craft: strong shot selection, confident editing, excellent music cues, and character animation that’s smoother than you’d expect.
But then the conversation gets interesting.
We dig into sound mastering and spatial audio, the difference between “dry” dialogue and believable room tone, how stylized realism can drift into “clay-face” territory, and what happens when a series leans hard into character introductions without giving the audience enough plot hooks to chase. Tracy goes deep on the structure across multiple chapters, and we talk about why view counts can drop when episodes feel like long-form animation sliced into shorts.
We also get into pipeline talk: Daz characters into iClone, motion capture vs animation libraries, and the very real challenge of stepping from an established fan universe (Star Trek / Star Wars) into an original world where you don’t get story shorthand for free.
If you make machinima, virtual production, iClone films, or Unreal/CG shorts, this ep is packed with practical takeaways: pacing, hooks, sound space, visual texture, and how to reveal character through action within the plot.
Audio Only Version of this Episode
YouTube Version of this Episode
Show Notes & Links
Quest of the Key: a Fantasy Adventure Series – Chapter 1 – “Help Wanted” by Auroratrek, released 3 Feb 2023
🎬 This week on And Now for Something Completely Machinima, we’re shaking (and stirring) things up with a deep dive into Benjamin Tuttle’s long-awaited James Bond machinima, Endgame – Part One 🍸💥
Host Damien Valentine kicks things off by revealing he actually voices Q in the film (recorded years ago!), before the panel digs into why this project is such a standout. Created in iClone and rendered in Unreal Engine, Endgame delivers a Bond look and feel that’s grounded, stylish, and refreshingly not sci-fi flashy—London actually looks like London, and the tone leans classic rather than futuristic.
🎶 From its full-length Bond-style title sequence and original theme song to slick action choreography, witty humor, and loving nods to Bond lore (Spectre, Q, M, Cold War vibes, and yes—the car), we agree: this is a heartfelt homage made with serious craft. There’s also a touching dedication to Ken White, honoring the machinima community that helped shape projects like this.
Of course, no good Bond briefing is complete without critique 👀 We debate storytelling clarity, episodic structure, sound mixing, facial animation quirks, and whether Part One leaves us with enough of a cliffhanger to fully ignite anticipation for what comes next.
🎤 Along the way, we talk:
What makes a Bond feel like Bond (without copying the originals)
Machinima’s evolution as a filmmaking medium
Unreal Engine vs iClone (and why skill matters more than tools)
Why this project is a major proof-of-concept for solo creators
💡 Bottom line: Endgame – Part One is ambitious, polished, and packed with love for both James Bond and machinima—and it sparks a lively, thoughtful discussion you won’t want to miss.
👉 Grab your martini, hit play, and join us for one of our most energetic episodes yet.
This week on the podcast, we’re diving into a grab-bag of big creator news, starting with YouTube, and yes… the “slop” situation.
Tracy kicks things off with what looks like YouTube’s latest attempt to clean house: platform changes that claim to improve privacy and the viewing experience, but also mess with how videos behave when embedded on third-party sites. If you stream shows inside places like Second Life, that’s a real headache, because some embeds and API-based workarounds are suddenly unreliable or broken.
But the bigger story? YouTube appears to be cracking down on the explosion of low-effort, mass-generated content. The buzz is that Gemini is being used to evaluate whether videos look human-made, original, and honestly presented – plus there’s talk of internal “trust scores” that creators can’t actually see, but which may influence how channels are treated behind the scenes. Tracy even tests how an AI describes our channel, and it basically nails the vibe: a legit passion-project podcast with deep experience… while also very clearly not the unrelated, controversy-riddled “Machinima Inc” from back in the day. Check out this video –
Phil jumps in to untangle the embed drama: it may not be “AI policy” so much as an ad-delivery and revenue-control move, since some embedded browsers can bypass ads, and Second Life gets caught in the crossfire. Workarounds exist (including the very ironic “embed it somewhere else first” method), and Vimeo comes up as an alternative… but with price hikes that feel more “premium platform” than creator-friendly. Locked-in subscriptions, anyone?
Then it’s off to the creative tools corner: Phil’s been deep in Blender, and he’s found some very machinima-friendly developments, like a third-person controller kit that basically turns Blender into a game-like character-puppeteering environment. On top of that, there’s a newly released Blender cloth-building and simulation tool that could become a budget-friendly alternative to pricey standards like Marvelous Designer – huge potential for indie creators who want great-looking outfits without a studio budget.
From there, the conversation swings to Reallusion’s latest move: Video Mocap, turning ordinary video footage into motion capture data, integrated straight into iClone’s workflow. The group talks practical realities (camera framing, background contrast, space constraints, upper-body capture modes) and why this could be a game-changer for animators who don’t have mocap suits lying around.
We also touch on Unreal Engine’s rapid evolution and its ever-improving animation tools—plus the eternal question: with tech this powerful, why aren’t we seeing more great films made with it? Check this out –
Damien drops some rock-solid creator advice: don’t try to learn new tools by making your magnum opus. Make a short “training film,” and if you switch platforms… remake it. Same story, new tech, better skills. Simple, smart, and honestly kind of brilliant.
Finally, we hit a spicy AI update: major AI music platforms (Suno and Udio) have reportedly reached settlements with record labels, meaning they’ll rework how training and licensing work going forward. That could reshape what “responsible” AI music use looks like in 2026 – and what it’ll cost creators.
And to wrap up on a lighter note, there’s a shoutout to NeuralVIZ and a fun character-driven sci-fi project, The Adventures of Remo Green, as a reminder that experimentation can still be entertaining (and weirdly impressive).
And that’s the episode: YouTube changes, creator workarounds, new animation toys, and the future of AI tools, served with equal parts curiosity and chaos.
And btw, to hear more about Ricky’s epic bus trip, check in on next week’s episode!
We love AFK! This week’s ep covers another of AFK’s ambitious world-building attempts to round out the Star Wars universe… and of course it works. What do you imagine the worst jobs to be for the most hapless Stormtroopers? These totally make sense to us! Check out our comments, our overview of AFK’s background and other projects, and what we really think of this short.
YouTube Version of This Episode
Show Notes & Links
Worst Jobs in Star Wars – A ‘For the Empire’ minisode created in Unreal 5.6 by AFK, released 19 June 2025
This week, we cover a lot of this month’s news, as usual, but start with a tribute to a machinima pioneer who has sadly crossed the bridge into an unknown world: Tutsy NAvArAthnA. We then go through some more fascinating projects we want to share, more about genAI and machinima, a bunch of new tools and techniques, some wise words from great artists, and the latest games and relevant updates. All links below.
YouTube Version of This Episode
Show Notes & Links
Celebrating Tutsy NAvArAthnA / Basile Vignes’ Life
Character Creator 5 update and connection to Unreal –
Fender Studio release, for creating 8-track audio, here
Martin Bell’s Unreal tutorial series –
How one detail completely changed a scene –
A LoRA for a Pierson’s Puppeteer –
NaturalVision Enhanced – release trailer –
Game Releases
Dune Awakening –
Cyberpunk 2077 –
And finally…
Fortnite’s definition of machinima states: ‘the use of real-time computer graphics engines, like that used with Fortnite, to create a cinematic production. The word machinima is a portmanteau of the words machine and cinema.’
Take a look at the interview Tracy did with Epic’s Chief Technology Officer Kim Libreri in Pioneers in Machinima!