In this episode, we review Tracy’s pick for the month: ‘The Eye: Calanthek’ by Aaron Sims, made in Unreal Engine 5 using MetaHuman tech. Released in 2021, it stands as an early exemplar of the engine’s capabilities.
YouTube Version of this Episode
Show Notes and Links
We discuss the eyes, the monster, the surprise and camera shots.
Time stamps
1:06 Tracy introduces ‘The Eye: Calanthek’ by Aaron Sims, released 4 November 2021
5:56 What makes it so realistic? The eyes!
11:02 Things that break the storytelling
15:02 Does knowing the craft of filmmaking restrict creative approaches to filmmaking?
This week’s Projects Update on machinima, virtual production and content creation:
The Crow
One of the most interesting creative projects we’ve seen so far using MidJourney, a creative AI generator, is The Crow (by Glenn Marshall Neural Art). Here the generator has been used to recreate a version of the ballet performance portrayed in the short film Painted (by Duncan McDowall and Dorotea Saykaly). Stunning, to say the least, and we recommend you play it at least once side-by-side against the original performance for added insight.
We’re so impressed with the potential of AI generators, whether that’s DALL-E, MidJourney, Stable Diffusion or any of the others that are now emerging, that we’re going to dedicate a special episode of the podcast to the subject next month, so watch out for that!
Jim Henson Company
Jim Henson Company is using real-time animation on their new show, Earth to Ned. Characters are created with Unreal (it’s the AI working in the background), and JHC has been so impressed with the workflow and the lack of any post-production requirement that it is looking to use the virtual production method more. What’s interesting is the level of feedback guests experience in the process – they are not aware of the puppeteering in the background, just the virtual actor on the screen, performing naturalistically in real time! We’ve not seen much of this kind of machinima before, although Hugh Hancock did some very early work on this and of course Rooster Teeth have done live performances using similar techniques. We can certainly expect to see a lot more of it, particularly for interactive theatre, VR and AR.
Half-Life 3
Was Half-Life 3 never going to be like the originals? This article on TechRadar is interesting: the author (Phil Iwaniuk) contends that remakes in the Half-Life franchise could never live up to the originals, because the extreme attention paid to the world of HL created more pressure than the Valve team could ever satisfy. We’re not sure about that, but it’s an interesting idea.
Dune: Awakening
Dune: Awakening, a very impressive MMO set in the Dune universe, has launched and is currently in beta. Here’s the trailer – we’re looking forward to seeing machinima made with this –
Dungeons & Dragons?
What does Dungeons & Dragons, typically a game played around a table, have to do with machinima? There’s been a rise in popularity of web-based shows where people play the game and act out scenes. This group (Corridor Crew) is using Unreal Engine 5 for virtual production (not quite The Mandalorian, but sort of similar) to put their actors, in real time, into the environments of their adventure. Check it out here –
This week, we share feedback on our reviews from September’s selection and discuss a video about learning production techniques, selected by Phil: ‘How I learned Unity without following tutorials’ by Mark Brown.
YouTube Version of this Episode
Show Notes & Links
0:47 Ricky highlights some of the news items we’ve released in a series of blogs this month, published on Mondays: our Tech Update on 3 Oct, then Projects Update on 10 Oct, Fests & Contests Update on 17 Oct and finally a Report on Creative AI Generators on 24 Oct. You can find these at https://CompletelyMachinima.com/Blog
6:51 How I learned Unity without following tutorials (Developing 1) by Mark Brown on his Game Maker’s Toolkit channel, released 28 September 2021 – discussion of learning by rote vs learning concepts, different learning styles, and the role of prototyping in machinima and virtual production.
Q: What are your favourite tutorials on machinima making? Get in touch and let us know via email or our other channels.
This week’s Tech Update picks for machinima, virtual production and 3D content producers:
Nvidia RTX 4080
Nvidia is launching two RTX 4080 graphics cards in November… you know what they say, you wait ages for a bus and then two come at once: the RTX 4080 12GB and the RTX 4080 16GB. Here’s the story on PC Gamer‘s website. You can also catch up on all of Nvidia’s latest announcements, made in Jensen Huang’s (CEO) keynote at GTC in September, in this video and on their blog here.
Ricky comments: Of course it was only a matter of time before Nvidia announced the 40-series of RTX graphics cards. Two models have been announced so far, the 4080 and the 4090, with the 30-series sticking around for the lower price range. My guess is that this lets them focus their resources on producing more of just two high-end cards instead of a whole range. Although given the prices of these new cards ($800+), I think I’ll be sticking with my 3070 for the time being.
UE 5.1.0
Epic have teased the new features coming to Unreal Engine 5.1.0 – see the features documentation on their website here. Onsetfacilities.com has produced a good overview – link here – and there’s a nice explainer by JSFilmz here –
Cine Tracer
Check out the new Actor Animation system in Cine Tracer v0.7.6. This update gives the Actors a set of talking animations that can be used as an alternative to the Posing system.
Follow the socials on Instagram and download Cine Tracer on Steam
Sketchfab
Sketchfab is doing a weekly listing of top cultural heritage and history models – these are actually pretty amazing and, of course, downloadable (for a fee)!
DALL-E
DALL-E, one of the creative AI generators getting all the buzz at the moment, has a new feature called Outpainting, which helps users extend an image beyond its original borders by adding visual elements in the same style, or take a story in new directions. This could be great for background shots in virtual productions.
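For the curious, here’s a rough sketch of how an Outpainting-style result can be approximated programmatically. The interactive editor lives in DALL-E’s web UI, but OpenAI’s image-edit API will fill any transparent region of a PNG, so padding a frame onto a larger transparent canvas achieves a similar effect. The file names and prompt below are hypothetical; the call uses the (pre-1.0) openai Python library’s image-edit method:

```python
# A minimal sketch, assuming a source frame smaller than 1024x1024.
# Requires `pip install openai pillow` (openai 0.x API shown).
from PIL import Image
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# Pad the source frame onto a larger transparent canvas; the model
# paints the transparent margins in the same style as the original.
src = Image.open("background_shot.png").convert("RGBA")  # hypothetical file
canvas = Image.new("RGBA", (1024, 1024), (0, 0, 0, 0))
canvas.paste(src, ((1024 - src.width) // 2, (1024 - src.height) // 2))
canvas.save("padded.png")

result = openai.Image.create_edit(
    image=open("padded.png", "rb"),
    mask=open("padded.png", "rb"),   # transparent pixels = regions to fill
    prompt="wide establishing shot, extend the scenery in the same style",
    n=1,
    size="1024x1024",
)
print(result["data"][0]["url"])  # URL of the extended image
```

Note that the same padded PNG is passed as both image and mask: the API treats transparent pixels as the regions to repaint, which here is exactly the padded margin.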
Second Life
Second Life have launched a puppetry project for their avatars which, as Wagner James Au reports in his regular blog on all things metaverse (and Second Life in particular), uses a webcam and mocap. Check out Au’s review of it here, read Second Life’s own post about it here, and follow their YouTube channel for the latest updates and how-tos here.
Eleven Labs
Eleven Labs have launched Voice Conversion, which lets you transform one person’s voice into another’s. It uses a process called voice cloning to encode the target voice – i.e., the voice we convert to – and then generates the same message spoken in a way that matches the target speaker’s identity while preserving the original intonation. What’s interesting about this is the filmmaking potential, but there are very clearly IP interests to be considered here – it has potential for machinima application, but beware the guidelines on using it. Importantly, note that it is primarily going to be used as part of an identity-preserving automatic dubbing tool which Eleven is launching in 2023. More here on this and the guidelines on using Voice Conversion.
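To make the workflow concrete, here’s a hedged sketch of what a conversion call might look like against ElevenLabs’ speech-to-speech API. The endpoint path, model id, voice id and file names below are assumptions drawn from ElevenLabs’ public API docs rather than from the launch announcement itself:

```python
# A hedged sketch: converting one speaker's recording into a target voice
# via ElevenLabs' speech-to-speech endpoint. Endpoint path, model id and
# voice id are assumptions from public docs; file names are hypothetical.
import requests

API_KEY = "YOUR_XI_API_KEY"          # placeholder
TARGET_VOICE_ID = "TARGET_VOICE_ID"  # the cloned voice to convert *to*

url = f"https://api.elevenlabs.io/v1/speech-to-speech/{TARGET_VOICE_ID}"
with open("original_performance.wav", "rb") as f:  # hypothetical input file
    response = requests.post(
        url,
        headers={"xi-api-key": API_KEY},
        files={"audio": f},
        data={"model_id": "eleven_english_sts_v2"},  # assumed model id
    )
response.raise_for_status()

# The response body is the converted audio: same words and intonation,
# rendered in the target speaker's voice.
with open("converted_performance.mp3", "wb") as out:
    out.write(response.content)
```

Even in a sketch like this, the IP point above stands: the target voice should be one you have the rights to clone.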