This week covers a reading of Edgar Allan Poe’s classic poem Alone, read by Shane Morris (audio is from the BEKNOWN channel), with visuals by Playard Studios. The film uses Unreal Engine’s MetaHuman and NVIDIA Omniverse’s Audio2Face, and some impressive introspective looks are achieved with the process… among a few other things we comment on, not least Ricky’s experience of reading poetry.
YouTube Version of this Episode
Show Notes & Links
ALONE film, released on 26 October 2022
BEKNOWN channel reading by Shane Morris on YouTube.
There are a growing number of ‘challenges’ that we’ve been finding over the last few months – many are opportunities to learn new tools or use assets created by studios such as MacInnes Studios. They are also incentivised with some great prizes, generally offered by the contest organizer, such as the Kitbash3D challenge we link to in this post. This week we were light on actual live contests to call out, but we have found someone who is always in the know: Winbush!
Mission to Minerva (deadline 2 Dec 2022)
Kitbash3D’s challenge is for you to contribute to the development of a new galaxy! On their website, they state: ‘Your mission, should you choose to accept, is to build a settlement on a planet within the galaxy. What will yours look like?’ Their ultimate aim is to crowdsource the creative work to their community, combining the artworks contest participants submit. There are online tutorials to assist, showing you how to use Kitbash3D in Blender and Unreal Engine 5, and your work can be either concept art or animation. Entry into the contest couldn’t be simpler: you just need to share your work on social media (Twitter, FB, IG, ArtStation) and use the hashtag #KB3Dchallenge. Winners will be announced on 20 December, and there are some great prizes, sponsored by the likes of Unreal, Nvidia, CG Spectrum, WACOM, The Gnomon Workshop, The Rookies and ArtStation. Entry details and more info here.
Pug Forest Challenge
This contest has already wrapped – but there are now a few of this type emerging: challenges that give you an asset to play with for a period of time, a submission process, and some fabulous prizes – all geared towards incentivising you to learn a new toolset, in this case UE5! So if you need the incentive to get motivated, it’s definitely worth looking out for these. Jonathan Winbush is also one of those folks whose tutorials are legendary in the UE5 community, so even if you don’t want to enter, he’s someone to follow.
MacInnes Studios’ Mood Scene Challenge
John MacInnes recently announced the winners of his Mood Scene Challenge contest that we reported on back in August – we must say, the winners have certainly delivered some amazing moods. Check out the showreel here –
This week we have two great films to share with you courtesy of Damien. Our main film is The Remnants by Stan Petruk, a disturbing tale of the aftermath of a global disaster, created as part of Reallusion’s Pitch and Produce programme. Our bonus film is a Mobile Short treat, So Palpatine Needs Padme Dead…[LEGO Edition] by Cinematic Series Gaming. It’s astonishing what can be packed into 60 seconds!
YouTube Version of this Episode
Show Notes & Links
The Remnants by Stan Petruk (released 7 June 2022)
This film has all the hallmarks of an Eastern European style that we’ve talked about before – remember Irradiation by Sava Zivkovic (S1 E22, October 2021) and The Ship by Mednios (S1 E2, March 2021)?
There’s a nice description of Stan’s pipeline and the tools he used to create the film on 80.lv here, and his comments about using Character Creator are on Reallusion’s website here. Below is also a nice video explainer by Stan.
So Palpatine Needs Padme Dead…[LEGO Edition] by Cinematic Series Gaming (released 29 Sept 2022)
Enjoy, and as ever, feedback and suggestions welcome!
Credits – Speakers: Ricky Grove, Damien Valentine, Tracy Harwood, Phil Rice Editor/Producer: Damien Valentine Music: Scott Buckley – www.scottbuckley.com.au (CC-BY 4.0)
This week’s pick is a 360 music video – a ‘metaverse’ video – by a creator we’ve been following all year, Jae Solina aka JSFilmz. The film has been created in UE5 and includes some nifty mocap, great dance moves and some interesting lighting effects. Hear what the team have to say about the film and format and let us have your comments too!
YouTube Version of this Episode
Show Notes & Links
Metaverse Music Video, released 10 Sept 2022 (note, the video can be viewed as a VR experience or a 360 video) – where is the Batman Easter Egg?!!!
Our discussion on Friedrich Kirschner’s immersive machinima, person2184, in THIS episode
Nightmare Puppeteer allows 360 filmmaking – check out the engine on Steam HERE
Key question: what new language might be needed for machinima story creators vs experience creators to get the most out of VR/360 formats?
Credits –
Speakers: Ricky Grove, Damien Valentine, Tracy Harwood (MIA Phil Rice, courtesy of Hurricane Ian) Producer/Editor: Damien Valentine Music: Scott Buckley – www.scottbuckley.com.au (CC-BY 4.0)
Hot on the heels of our discussion of AI generators last week, we are interested to see tools already emerging that turn text prompts into 3D objects and film content, alongside a tool for making music too. We have no fewer than five interesting updates to share here – plus a potentially very useful tool for rigging the character assets you create!
Another area of rapid technological advancement is mocap, especially markerless capture – which, let’s face it, is really the only way to think about creating naturalistic movement-based content. We share two interesting updates this week.
AI Generators
NVIDIA has launched an AI tool that generates 3D objects (see video). Called GET3D (derived from ‘Generate Explicit Textured 3D meshes’), the tool can generate characters and other 3D objects, as explained by Isha Salian on the NVIDIA blog (23 Sept). The code is currently available on GitHub, with instructions on how to use it here.
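For the curious, here’s a minimal sketch of what sampling from a pretrained GET3D generator looks like conceptually. The wrapper names below are hypothetical stand-ins, not the actual API from the repo, so check the GitHub instructions for the real entry points.

```python
# Conceptual sketch only – GET3D is a GAN that maps two latent codes
# (one for geometry, one for texture) to a textured 3D mesh.
# The helpers below are hypothetical; see the official repo for real usage.
import torch
from get3d_demo import load_pretrained_generator, export_mesh  # hypothetical helpers

G = load_pretrained_generator("shapenet_car.pt")  # assumed checkpoint name

z_geometry = torch.randn(1, 512)  # latent code controlling shape
z_texture = torch.randn(1, 512)   # latent code controlling surface texture

with torch.no_grad():
    mesh = G(z_geometry, z_texture)  # yields vertices, faces and a texture map

export_mesh(mesh, "car.obj")  # the exported .obj can go straight into Blender or UE5
```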
Google Research, together with researchers at the University of California, Berkeley, is also working on similar tools (reported in Gigazine on 30 Sept). DreamFusion uses NeRF tech to create 3D models that can be exported into 3D renderers and modelling software. You can find the tool on GitHub here.
DreamFusion
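Under the hood, DreamFusion’s core trick is ‘score distillation sampling’ (SDS): render the NeRF from a random camera, add noise to the render, ask a frozen text-conditioned diffusion model to predict that noise, and push the difference back into the NeRF’s parameters. Below is a minimal sketch of one optimisation step based on the paper’s description – the helper names are hypothetical, not the released codebase.

```python
# Minimal sketch of DreamFusion-style score distillation sampling (SDS).
# All helpers (sample_random_camera, nerf.render, diffusion.*) are assumed.
import torch

def sds_step(nerf, diffusion, prompt_embedding, optimizer):
    camera = sample_random_camera()             # hypothetical helper
    image = nerf.render(camera)                 # differentiable render
    t = torch.randint(20, 980, (1,))            # random diffusion timestep
    noise = torch.randn_like(image)
    noisy = diffusion.add_noise(image, noise, t)
    with torch.no_grad():                       # the diffusion model stays frozen
        noise_pred = diffusion.predict_noise(noisy, t, prompt_embedding)
    # SDS gradient: the (predicted - true) noise residual flows into the
    # NeRF parameters only; the timestep weighting term is omitted here.
    loss = ((noise_pred - noise).detach() * image).sum()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Repeating this step over thousands of random cameras is what gradually sculpts a 3D model that matches the text prompt from every viewpoint.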
Meta has developed a text-to-video generator called Make-A-Video. It works from a single image, or can fill in between two images to create motion. It currently generates five-second videos, which are perfect for background shots in your film. Check out the details on Meta’s website here (and sign up for updates too). Let us know how you get on with this one too!
Make-A-Video
Runway has released a Stable Diffusion-based tool, called Erase and Replace, that allows creators to switch out the bits of an image they don’t like and replace them with things they do (reported in 80.lv on 19 Oct). There are some introductory videos on Runway’s YouTube channel (see below for the introduction to the tool).
And finally, also available on GitHub, is Mubert, a text-to-music generator that uses a Deforum Stable Diffusion Colab. Described as proprietary tech, its creator provides a custom license: anything created with the tool cannot be released on DSPs (digital service providers) as your own. It can be used for free, with attribution (mentioning @mubertapp and hashtag #mubert), to sync with images and videos, with an option to contact them directly if a commercial license is needed.
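For a sense of how the Colab drives the service, here’s a rough sketch of a text-to-music request. The endpoint name, payload shape, and response structure below are our assumptions based on the public Colab, so treat this as illustrative rather than reference documentation.

```python
# Rough sketch of how the Mubert text-to-music Colab appears to work: a text
# prompt is reduced to tags and posted to Mubert's B2B API, which returns a
# link to the generated track. Endpoint and payload are ASSUMED, not official.
import requests

MUBERT_URL = "https://api-b2b.mubert.com/v2/TTMRecordTrack"  # assumed endpoint

payload = {
    "method": "TTMRecordTrack",
    "params": {
        "pat": "<access-token>",           # token obtained via a separate access call
        "tags": ["ambient", "cinematic"],  # tags derived from your text prompt
        "duration": 60,                    # seconds of music to generate
    },
}

resp = requests.post(MUBERT_URL, json=payload, timeout=60)
track_url = resp.json()["data"]["tasks"][0]["download_link"]  # assumed response shape
print("Generated track:", track_url)
```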
Character Rigging
Reallusion’s Character Creator 4.1 has launched with built-in AccuRIG tech – this turns any static model into an animation-ready character and also comes with cross-platform support. No doubt very useful for those assets you might want to import from any of the AI generators you use!
Motion Capture Developments
That ever-ready multi-tool, the digital equivalent of the Swiss Army knife, has come to the rescue once again: the iPhone can now be used for full-body mocap in Unreal Engine 5.1, as illustrated by Jae Solina, aka JSFilmz, in his video (below). Jae used move.ai, which is rapidly becoming the gold standard in markerless mocap tech, and for which you can find a growing number of demo vids on YouTube showing just how detailed the captured movement can be. You can find move.ai tutorials on Vimeo here, and for more details about which smartphones you can use, go to their website here – it’s very impressive.
Another form of capture works from the detail of the image itself. Reality Capture has launched a tool that you can use to capture yourself (or anyone else for that matter, including your best doggo buddy) and import the resulting mesh into Unreal’s MetaHuman. Even more impressive is that Reality Capture is free – download details here.
We’d love to hear how you get on with any of the tools we’ve covered this week – hit the ‘talk’ button on the menu bar up top and let us know.