This week we review the winner of the 2022 Kitbash3D moving image challenge, based on its free Mission to Minerva asset pack. The film is called Secret Moon by Orencloud, and what a visually stunning and ethereal representation of Minerva this is, with a clear trajectory between this piece and Orencloud’s portfolio. We discuss some of the ways in which the film works, and works less well, for us, and note that at least one of us missed the ending!
YouTube Version of this Episode
Show Notes and Links
Secret Moon, by Orencloud, released 1 December 2022, film link –
This week’s #MondayMotivation post has some more great examples of machinima and virtual production projects. We have a selection of shorts made using Unreal Engine and another entirely made in Blender, plus a couple of ‘making ofs…’ and a ‘role of…’ also worth checking out.
An artist we’ve talked about before who has created extensive work over many years in Second Life is Bryn Oh, and now she has created a nostalgic experience called Lobby Cam, available on Steam and made using UE5. The experience is a walking tour of an extensive environment, with a story told through the pages of a ripped-up diary. The project has been reviewed by Wagner James Au on his blog, New World Notes, here. It is described as part of a larger narrative, and here’s a video sampler of the tour produced as part of that too… an interesting approach to virtual storytelling –
Off planet, another project which contains amazing detail of other worlds is by Melody Sheep, called The Sights of Space: A Voyage to Spectacular Alien Worlds (released 29 Nov 2022). This is a 30-minute film of speculative depictions of space scenes based on ‘current scientific understanding’ of the Milky Way, albeit with extensive creative license. If you ever wanted to get into a new type of documentary, this is probably the one to have on your watchlist –
We were also thrilled to see what promises to be a very interesting new series launching later this year by Melody Sheep, called The Human Future, check out the trailer on the channel here.
In our next project pick, called JOYCE by GTshortStories (released 14 Dec 2022), UE5 and seemingly every tool available with it have been used to create an interesting space story. This mixes live action with some well done animation, and the integration is handled really well, so it’s a great example to check out. Joyce is a backchatting robot exploring a facility along with Sergeant Terry Brown – there are many references to popular sci-fi tropes, so do check this one out! GTshortStories is also putting out other creative content, so check out the channel too.
Our final space project for this week is Countdown, by Andrew Klimov, featured on CG Channel. This is a fast-paced story of a crash landing onto an alien planet, all about the crash itself, and it certainly makes you feel it. The film is the beginning of a new series, and you can find out more about that on his website here. There’s also an interesting breakdown of the filmmaking process on his Vimeo channel here.
Our next project pick goes back to the 11th Century, inspired by an Umbrian folk tale from the novel ‘E poi si fece buio’ by Matteo Bebi. It tells of a dream by Imiltrude, who lived in a hidden village and was sentenced to death for causing a fire that destroyed a city. The film, HIMIL, is by Tiziano Fioriti and Andrea Brunetti, made using UE5, and is a fascinating first-person-perspective piece with a very well done soundscape –
Our next project pick is a Blender-made movie and another example of great storytelling, this time in a cyberpunk environment with a really nice twist in the tale. Not sure it would be Ricky’s cup of tea, to his point about emotional representation, but I certainly loved it! The story has been created by the Blender HQ team, so it’s by no means an indie endeavour – there’s a whole team of folks behind the processes employed – but it’s definitely worth watching: check out the pace of the action and the sound design in particular. The film is called Charge – Blender Open Movie (released 15 Dec 2022) and you can access the production files and making-of videos for the film here.
We always love a good homage to Star Wars, and this week we have a feature from the Reallusion Magazine, which describes how iClone and the Vicon mocap system have been used to recreate that iconic ‘I am your father’ scene from The Empire Strikes Back. The short has been made by Luis Cepeda from Quitasueño Studios, based in the Dominican Republic, and he provides a great step-by-step guide to how it was made, with a video overview here –
Ever wondered how to use a MIDI controller with UE5 to drive all sorts of effects in real time? Well, here’s a fantastic video tutorial for you by Taiyaki Studios featuring Cory Williams –
And finally this week, we share Loralee Sundra’s video on the Internet Archive about the value of public domain films from her perspective as a Frontline Fellow at the Documentary Film Legal Clinic at UCLA School of Law. Her talk was part of the Internet Archive’s Public Domain Day 2023 celebration, held on 25 Jan 2023.
Ricky’s pick for the month is Let’s Play Nomad X by Kristian Andrews, uploaded to Vimeo on 11 Mar 2013. The film is a ‘slice of life’ comedy with an intriguing mix of playing a 90s-style space game, Nomad X, and reflecting on losing the love of your life – a perfect antidote for those Valentine’s week shenanigans!
YouTube Version of the Episode
Show Notes and Links
Let’s Play Nomad X by Kristian Andrews, released 11 March 2013 –
Link to Director’s Notes interview about the film here
One of the director’s favourite films, Frontier Fundamentals, Ep 1, @JimPlaysGames –
Game mentioned in the discussion, Barbara-Ian, available on Steam here
This week, we highlight some time-saving examples for generating 3D models using – you guessed it – AIs, and we also take a look at some recent developments in motion tracking for creators.
All these examples highlight that generating a 3D model isn’t the end of the process and that once it’s in Blender, or another animation toolset, there’s definitely more work to do. These add-ons are intended to help you reach your end result more quickly, cutting out some of the more tedious aspects of the creative process using AIs.
Blender is one of those amazing animation tools that has a very active community of users and, of course, a whole heap of folks looking for quick ways to solve challenges in their creative pipeline. We found folks who have integrated OpenAI’s ChatGPT into the toolset by developing add-ons. Check out this illustration by Olav3D, whose comment about using ChatGPT for attempting to write Python scripts sums it up nicely, “better than search alone” –
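For a sense of how these ChatGPT-to-Blender bridges tend to work under the hood, here’s a minimal sketch (hypothetical code, not taken from any particular add-on): the add-on wraps your request in a system prompt that constrains the model to emit only `bpy` Python, cleans up the reply, and then `exec`s it inside Blender.

```python
# Hypothetical sketch of a ChatGPT-to-Blender bridge, not any real add-on's code.

def build_prompt(user_request: str) -> list[dict]:
    """Wrap the user's request so the model returns runnable bpy code only."""
    system = (
        "You are a Blender scripting assistant. Reply with only a Python "
        "script using the bpy API - no prose and no markdown fences."
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user_request},
    ]


def strip_fences(reply: str) -> str:
    """Models often wrap code in ```python fences despite instructions, so
    drop any fence lines before executing the script."""
    lines = [ln for ln in reply.strip().splitlines() if not ln.startswith("```")]
    return "\n".join(lines)


# Inside Blender, the add-on would then do something along these lines:
#   messages = build_prompt("add a 5x5 grid of cubes")
#   reply = <send messages to the ChatGPT API and read back the text>
#   exec(strip_fences(reply))  # runs against the live bpy scene
```

The `exec` step is what makes these add-ons feel magical (and also why it’s worth reading the generated script before running it on a scene you care about).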
Dreamtextures by Carson Katri is a Blender add-on using Stable Diffusion which is so clever that it even projects textures onto 3D models (with our thanks to Krad Productions for sharing this one). In this video, Default Cube talks about how to get results with as few glitches as possible –
and this short tells you how to integrate Dreamtextures into Blender, by Vertex Rage –
To check out Dreamtextures for yourself, you can find Katri’s application on Github here and, should you wish to support his work, subscribe to his Patreon here too.
OpenAI also launched its Point-E 3D model generator this month, whose output can be imported into Blender. But, as CGMatter has highlighted, using the published APIs means a very long time sitting in queues waiting for downloads, whereas downloading the code to run locally on your own machine is easy – and once you have it, you can create point-cloud models in seconds. In his case he’s running the code from Google Colab, which means you can also run it in the cloud. Here’s his tutorial on how to use Point-E without the wait, giving you access to your own version of the code (on Github) in Colab –
We also found another very interesting Blender add-on, this one letting you import models from Google Maps into the toolset. The video is a little old, but the latest update of the add-on on Github, version 0.6.0 (for RenderDoc 1.25 and Blender 3.4), has just been released. It’s created by Elie Michel –
We were also interested to see NVIDIA’s update at CES (in January). It announced a release for the Omniverse Launcher that supports 3D animation in Blender, with generative AIs that enhance characters’ movement and gestures, a future update to Canvas that includes 360 surround images for panoramic environments and also an AI ToyBox, that enables you to create 3D meshes from 2D inputs. Ostensibly, these tools are for creators to develop work for the metaverse and web3 applications, but we already know NVIDIA’s USD-based tools are incredibly powerful for supporting collaborative workflows including machinima and virtual production. Check out the update here and this is a nice little promo video that sums up the integrated collaborative capabilities –
As fast as the 3D modelling scene is developing, so is motion tracking. Move.ai, which launched late last year, announced its pricing this month: $365 for 12 months of unlimited processing of recordings. This is markerless mocap at its very best, although not so much if you want to do live mocap (no pricing announced for that yet). Move.ai (our feature image for this article) lets you record content using mobile phones (even a couple of old iPhones). You can find out more on its new website here, and here’s a fun taster called Gorillas in the Mist, with ballet and four iPhones, released in December by the Move.ai team –
And another app, although not 3D, is Face 2D Live, released by Dayream Studios – Blueprints in January. This tool allows you to live-link a face app on your iPhone or iPad to make cartoons out of just about anything, including with friends who are also using the iPhone app. It costs just $14.99 and is available on the Unreal Marketplace here. Here’s a short video example to whet your appetite – we can see a lot of silliness ensuing with this for sure!
Not necessarily machinima, but for those interested in more serious facial mocap, Weta has been talking about how it developed its facial mocap processes for Avatar, using something called an ‘anatomically plausible facial system’. This is an animator-centric system that captures muscle movement, rather than ‘facial action coding’, which focusses on identifying emotions. Weta stated its approach leads to a wider set of facial movements being integrated into the mocapped output – we’ll no doubt see more in due course. Here’s an article on the FX Guide website which discusses the approach being taken, and for a wider-ranging discussion of the types of performance tracking used by the Weta team, Corridor Crew have bagged a great interview with the Avatar VFX supervisor, Eric Saindon, here –
This week’s episode reviews a film by Fandom Games called Half Life 3: Honest Game Trailers, released 31 March 2015. The film is a throwback style of satire, and even if you’ve not been following Valve’s Half Life developments, you’ll enjoy this. For those less in the know, HL3 is a game that never made it beyond the hype, despite its avid fan base waiting patiently (or impatiently) to play it for 20 years! We reflect on Fandom Games’ origins in Jimmy Wales’ wiki technology. We also discuss our approach to ‘healthy criticism’ on the podcast.