Omniverse

Tech Update 2 (Feb 2023)

Tracy Harwood Blog February 13, 2023

This week, we highlight some time-saving examples for generating 3D models using – you guessed it – AIs, and we also take a look at some recent developments in motion tracking for creators.

3D Modelling

All these examples highlight that generating a 3D model isn’t the end of the process and that once it’s in Blender, or another animation toolset, there’s definitely more work to do. These add-ons are intended to help you reach your end result more quickly, cutting out some of the more tedious aspects of the creative process using AIs.

Blender is one of those amazing animation tools that has a very active community of users and, of course, a whole heap of folks looking for quick ways to solve challenges in their creative pipeline. We found folks that have integrated OpenAI’s ChatGPT into the toolset by developing add-ons. Check out this illustration by Olav3D, whose comment about using ChatGPT to write Python scripts sums it up nicely: “better than search alone” –
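To give a flavour of what that looks like in practice, here’s a minimal sketch of the kind of bpy script ChatGPT will happily draft for you (the cube-scatter idea is our own illustrative example, not Olav3D’s) – paste it into Blender’s Scripting tab to try it:

```python
# Scatter 50 small cubes at random positions on the XY plane.
# Runs inside Blender, which bundles the bpy module.
import random
import bpy

for _ in range(50):
    x = random.uniform(-10.0, 10.0)
    y = random.uniform(-10.0, 10.0)
    bpy.ops.mesh.primitive_cube_add(size=0.5, location=(x, y, 0.0))
```

From there, asking ChatGPT to tweak the script – randomise rotations, add materials, keyframe the results – is usually quicker than trawling the API docs, which is exactly Olav3D’s point.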

Dream Textures by Carson Katri is a Blender add-on using Stable Diffusion which is so clever that it even projects textures onto 3D models (with our thanks to Krad Productions for sharing this one). In this video, Default Cube talks about how to get results with as few glitches as possible –

and this short by Vertex Rage tells you how to integrate Dream Textures into Blender –

To check out Dream Textures for yourself, you can find Katri’s add-on on GitHub here and, should you wish to support his work, subscribe to his Patreon channel here too.

OpenAI also launched its Point-E 3D model generator this month, whose output can then be imported into Blender. As CGMatter has highlighted, using the published API means a very long time sitting in queues to access the downloads, whereas downloading the code to run locally on your own machine is easy – and once you have it, you can create point-cloud models in seconds. He actually runs the code from Google Colab, which means you can run it in the cloud. Here’s his tutorial on how to use Point-E without the wait, giving you access to your own version of the code (on GitHub) in Colab –
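For reference, the workflow CGMatter runs in Colab boils down to the repo’s own text-to-point-cloud sample. Here’s a condensed sketch of it, based on the example notebook in the openai/point-e repository at the time of writing – module names, config keys and arguments may have shifted since, so check them against the current code:

```python
# Condensed from the text-to-point-cloud example in the openai/point-e repo
# (https://github.com/openai/point-e). Treat this as a sketch: config names
# and arguments follow the example notebook as published and may change.
import torch
from tqdm.auto import tqdm

from point_e.diffusion.configs import DIFFUSION_CONFIGS, diffusion_from_config
from point_e.diffusion.sampler import PointCloudSampler
from point_e.models.configs import MODEL_CONFIGS, model_from_config
from point_e.models.download import load_checkpoint
from point_e.util.plotting import plot_point_cloud

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Text-conditioned base model plus an upsampler, as in the published example.
base_name = 'base40M-textvec'
base_model = model_from_config(MODEL_CONFIGS[base_name], device)
base_model.eval()
base_model.load_state_dict(load_checkpoint(base_name, device))

upsampler_model = model_from_config(MODEL_CONFIGS['upsample'], device)
upsampler_model.eval()
upsampler_model.load_state_dict(load_checkpoint('upsample', device))

sampler = PointCloudSampler(
    device=device,
    models=[base_model, upsampler_model],
    diffusions=[diffusion_from_config(DIFFUSION_CONFIGS[base_name]),
                diffusion_from_config(DIFFUSION_CONFIGS['upsample'])],
    num_points=[1024, 4096 - 1024],
    aux_channels=['R', 'G', 'B'],
    guidance_scale=[3.0, 0.0],
    model_kwargs_key_filter=('texts', ''),  # don't text-condition the upsampler
)

# Sample a point cloud from a text prompt (seconds on a GPU).
prompt = 'a red motorcycle'
samples = None
for x in tqdm(sampler.sample_batch_progressive(
        batch_size=1, model_kwargs=dict(texts=[prompt]))):
    samples = x

pc = sampler.output_to_point_clouds(samples)[0]
plot_point_cloud(pc, grid_size=3)
```

The repo also includes an example for converting the point cloud to a mesh, which can then be exported and cleaned up in Blender.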

We also found another very interesting Blender add-on, this one letting you import models from Google Maps into the toolset. The video is a little old, but the latest update of the add-on on GitHub, version 0.6.0 (for RenderDoc 1.25 and Blender 3.4), has just been released, created by Elie Michel –

We were also interested to see NVIDIA’s update at CES (in January). It announced a release of the Omniverse Launcher that supports 3D animation in Blender, with generative AIs that enhance characters’ movement and gestures, a future update to Canvas that includes 360 surround images for panoramic environments, and also an AI ToyBox that enables you to create 3D meshes from 2D inputs. Ostensibly, these tools are for creators to develop work for the metaverse and web3 applications, but we already know NVIDIA’s USD-based tools are incredibly powerful for supporting collaborative workflows, including machinima and virtual production. Check out the update here, and this is a nice little promo video that sums up the integrated collaborative capabilities –

Tracking

As fast as the 3D modelling scene is developing, so is motion tracking. Move.ai, which launched late last year, announced its pricing this month: $365 for 12 months of unlimited processing of recordings – this is markerless mocap at its very best, although not so much if you want to do live mocap (no pricing announced for that yet). Move.ai (our feature image for this article) lets you record content using mobile phones (a couple of old iPhones will do). You can find out more on its new website here, and here’s a fun taster, called Gorillas in the mist, with ballet and 4 iPhones, released in December by the Move.ai team –

And another app, although not 3D, is Face 2D Live, released by Dayream Studios – Blueprints in January. This tool lets you live link a face-capture app on your iPhone or iPad to turn just about anything into a cartoon, including together with friends who are also using the app. It costs just $14.99 and is available on the Unreal Marketplace here. Here’s a short video example to whet your appetite – we can see a lot of silliness ensuing with this for sure!

Not necessarily machinima, but for those interested in more serious facial mocap, Weta has been talking about how it developed its facial mocap processes for Avatar, using something called an ‘anatomically plausible facial system’. This is an animator-centric system that captures muscle movement, rather than ‘facial action coding’, which focusses on identifying emotions. Weta stated its approach leads to a wider set of facial movements being integrated into the mocapped output – we’ll no doubt see more in due course. Here’s an article on the FX Guide website which discusses the approach being taken, and for a wider-ranging discussion on the types of performance tracking used by the Weta team, Corridor Crew have bagged a great interview with the Avatar VFX supervisor, Eric Saindon, here –

Tech Update 2 (Dec 2022)

Tracy Harwood Blog December 12, 2022

This week, we share updates that will add to your repertoire of tools, tuts and libraries, along with a bit of fighting inspiration for creating machinima and virtual production.

Just the Job!

Unreal Engine has released a FREE animation course. Their ‘starter’ course includes contributions from Disney and Reel FX and is an excellent introduction to some of the basics in UE. Thoroughly recommended, even as a refresher for those of you that already have some of the basics.

Alongside the release of UE5.1, a new KitBash3D Cyber District kit has also been released, created by David Baylis. It looks pretty impressive – read about it on their blog here.


Cineshare has released a tutorial on how to create a pedestrian environment scene using Reallusion’s ActorCore, iClone and Nvidia Omniverse. The tutorial has also been featured on Reallusion Magazine’s site here.

Nvidia Omniverse has released Create 2022.3.0 in beta. Check out the updates on its developer forum here and watch the highlights on this video –

Libraries

We came across this amazing 3D scan library, unimaginatively called ScansLibrary, which includes a wide range of 3D and texture assets. It’s not free but it is relatively low cost: many assets cost a single credit, for example, and a 60-credit package is $29 per month. Make sure you check out the terms!

example of a flower, ScansLibrary

We also found a fantastic sound library, Freesound.org. The library includes tens of thousands of audio clips, samples, recordings and bleeps, all released under CC licenses and free to use for non-commercial purposes. Sounds can be browsed by keywords, a ‘sounds like’ search and other methods. The database has been running since 2005 and is supported by its community of users and maintained by the Universitat Pompeu Fabra, Barcelona, Spain.
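If you would rather search the library from a script than from the website, Freesound also offers a public API. Below is a minimal sketch using Python’s requests library, assuming you have registered for a free API key (the key value and search terms are placeholders) – check the APIv2 docs for current parameters:

```python
# Query Freesound's APIv2 text-search endpoint for clips matching a keyword.
# Requires a (free) API key from freesound.org.
import requests

API_KEY = "YOUR_FREESOUND_API_KEY"  # placeholder - substitute your own key

response = requests.get(
    "https://freesound.org/apiv2/search/text/",
    params={
        "query": "rain on tent",      # plain keyword search
        "fields": "id,name,license",  # only return the fields we need
        "token": API_KEY,
    },
    timeout=30,
)
response.raise_for_status()

for sound in response.json()["results"]:
    print(f'{sound["id"]}: {sound["name"]} ({sound["license"]})')
```

Remember that the results carry individual CC licenses, so check each one before using a clip in a commercial project.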


Not really a library as such, but Altered AI is a tool that lets you change the voices on your recordings, including those you record directly into the platform. It’s a cloud-based service and it’s not free, but it has a reasonably accessible pricing strategy. This is perfect if you’re an indie creator and want a bunch of voices but can’t find the actor you want! (Ricky, please close your ears to this.) The video link is a nice review by Jae Solina, JSFilmz – check it out –

Fighting Inspiration

The fighting action game Sifu is being updated to allow for recording and playback, so you can essentially create your own martial arts movies. If you’re interested in creating fight scenes, then this might be something to check out.


S3 E49 Film Review: ‘Most Precious Gift’ by Shangyu Wang (Oct 2022)

Tracy Harwood Podcast Episodes October 19, 2022

This week, Damien has picked a very interesting Eastern-made alien tale. It’s been beautifully shot and rendered using Omniverse, and inspired him to try some of the techniques shown. Ricky is a little more critical of the nostalgic trope. Tracy reflects on the journey of the storytelling, and the nature of what it is to be human, which is at the heart of the story. Phil brings Solaris into the discussion, as only Phil can. Overall, we reflect on the different styles of animation used and how influential they were. And, finally, how on earth did the producer achieve that tendril effect?!



YouTube Version of this Episode

Link to Film

Completely Machinima S2 Ep 43 Films (August 2022)

Tracy Harwood Podcast Episodes August 11, 2022

In this episode, Damien, Ricky and Tracy discuss four very different films. Damien reviews an interesting explainer on witches in The Folklore of Phasmophobia; Ricky presents us with another of Jae Solina’s tutorials, this time on path tracing in Omniverse; Tracy selects Tiny Elden Ring – yep, it’s tiny! And Phil, absent due to sickness, ironically picked a satirical zombie fest, which mixed Walking Dead ‘live action’ with machinima! The team then discuss that approach to creating films, highlighting some of the key challenges, with some more fab examples of films that have used the techniques well.



YouTube Version of this Episode

Show Notes and Links

0:57 The Folklore of Phasmophobia | Modern Mythology, by The Digital Dream Club (released 9 January 2021)


9:51 NVIDIA Omniverse Machinima Path Tracing Test, by JSFilmz (21 June 2022), and a nice little article on the difference between rasterization, ray tracing and path tracing that folks might find interesting: Nvidia says real-time path tracing is on the horizon, but what is it?, by Eric Frederiksen, Gamespot.com, 1 May 2022


17:33 Tiny Elden Ring | Tilt Shift, by Flurdeh (11 April 2022); here’s Flurdeh’s list of filmmaking tools https://github.com/Flurdeh/Youtube-Resources and a post-production tutorial on the tilt-shift effect, How to create Tilt-Shift / Miniature World Time-lapses, by Science Filmmaking Tips (24 Jan 2017)


27:27 What a typical Project Zomboid run looks like, by Pathoze (26 Jan 2022)


31:45 Discussion: using live action with machinima footage in films, what are the challenges?

Examples mentioned –

39:11 Damien’s The Great Bug War on Machinima Expo (8 December 2014)

Damien and Kim Genly

46:12 Ricky’s reference to a 2D/3D combo – Carson Mell’s TARANTULA A-1: Nightmares (5 August 2012), shot in Los Angeles


48:30 Phil Tippett’s stop-motion film Mad God, which includes live action with animation (now available on Shudder TV)


51:20 Tutsy Navarathna’s film, A Journey into the Metaverse and an interview we did with him on the podcast in Season 1


Completely Machinima S2 Ep 35 Films & Discussion (April 2022)

Tracy Harwood Podcast Episodes April 14, 2022

This month, Damien leads the Completely Machinima crew with a review and discussion of Reallusion’s iClone films. Alongside the few amazing creative projects we discussed, by Warlord, Rene Jacob and Martin Klekner, finding a broad selection proved a challenge, so we were interested to reflect on why there are so few iClone movies to see.



Video Version of this Episode

Show Notes & Links

2:23 The Sniper by Warlord (iClone 4), released 29 August 2009


14:47 Alien: The Message by direx1974/Rene Jacob (iClone 7), released 27 January 2022 (also check in on our interview with Rene Jacob)


17:25 Ricky begins the discussion on ‘what is a fan film?’

25:42 Continued discussion of ‘fan films’ and ‘fan culture’

34:20 Heroes of Bronze ‘Journeys’ by Martin Klekner (iClone 7), teaser released 4 March 2021


Interview with Martin Klekner on Reallusion’s blog (14 April 2021)

37:00 iClone vs Blender vs Omniverse discussion

44:48 Ben Tuttle’s short course for animators

53:42 Where is the iClone community and why aren’t there more creative projects to see?